In the article below, we will (i) automatically find the Option (of choice) closest to At The Money (ATM) and (ii) calculate its Implied Volatility and Greeks. We focus below on Future (Monthly) Options on the Index .STOXX50E (EURO STOXX 50 EUR PRICE INDEX) ('EUREX') and .SPX (S&P 500 INDEX), although you can apply the same logic to another index. To find the ATM instrument, we simply and efficiently use the Search API. Usually, calculating the Black-Scholes-Merton model's Implied Volatility involves numerical techniques, since the model has no closed-form solution for volatility (unless restricting assumptions are made, e.g.: that log returns follow a standard normal distribution with mean zero, $\mu$ = 0, and standard deviation one, $\sigma$ = 1). If we used these techniques to calculate each Implied Volatility value on our own computer, each data point could take several seconds - if not minutes - to compute. I have chosen to use the Instrument Pricing Analytics (IPA) service in the Refinitiv Data Platform API Family instead, as this service allows me to send model specifications (and variables) and receive several (up to 100) computed Implied Volatility values in one go - in a few seconds. Not only does this save a great deal of time, but also many lines of code!
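To make that cost concrete, here is a minimal, self-contained sketch of the numerical route (a plain bisection root-search on the Black-Scholes-Merton call price; it is not part of this article's workflow, and all names in it are ours). A search like this has to run once per data point, which is precisely the work IPA batches away:

```python
import math

def bs_call(S, K, T, r, sigma):
    # Black-Scholes-Merton price of a European call (no dividends)
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def implied_vol(price, S, K, T, r, lo=1e-4, hi=5.0, tol=1e-8):
    # Bisection: the call price is monotone increasing in sigma,
    # so we can bracket the volatility that reproduces the observed price
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) > price:
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)
```

For instance, pricing an at-the-money one-year call at a 20% volatility and feeding the resulting price back into `implied_vol` recovers 0.20 to within the tolerance.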
import refinitiv.data as rd # This is LSEG's Data and Analytics' API wrapper, called the Refinitiv Data Library for Python.
from refinitiv.data.content import historical_pricing # We will use this Python Class in `rd` to show the Implied Volatility data already available before our work.
from refinitiv.data.content import search # We will use this Python Class in `rd` to find the instrument we are after, closest to At The Money.
import refinitiv.data.content.ipa.financial_contracts as rdf # We're going to need this to use the content layer of the RD library and the Implied Volatility and Greeks calculators in Instrument Pricing Analytics (IPA) for Exchange Traded Instruments (ETI)
from refinitiv.data.content.ipa.financial_contracts import option # We're going to need this to use the content layer of the RD library and the calculators of Greeks and Implied Volatility in IPA & ETI
import numpy as np # We need `numpy` for mathematical and array manipulations.
import pandas as pd # We need `pandas` for dataframe and array manipulations.
import calendar # We use `calendar` to identify holidays and maturity dates of instruments of interest.
import pytz # We use `pytz` to manipulate time values, aiding the `calendar` library. To import its types, you might need to run `!python3 -m pip install types-pytz`
import pandas_market_calendars as mcal # Used to identify holidays. See `https://github.com/rsheftel/pandas_market_calendars/blob/master/examples/usage.ipynb` for info on this market calendar library
from datetime import datetime, timedelta, timezone # We use these to manipulate time values
from dateutil.relativedelta import relativedelta # We use `relativedelta` to manipulate time values aiding `calendar` library.
import requests # We'll need this to send requests to servers via the delivery layer - more on that below
# `plotly` is a library used to render interactive graphs:
import plotly.graph_objects as go
import plotly.express as px # This is just to see the implied vol graph when that field is available
import matplotlib.pyplot as plt # We use `matplotlib` just in case users do not have an environment suited to `plotly`.
from IPython.display import clear_output, display # We use `clear_output` for users who wish to loop graph production on a regular basis. We'll use this to `display` data (e.g.: pandas data-frames).
from plotly import subplots
import plotly
# Let's authenticate ourselves to LSEG's Data and Analytics service, Refinitiv:
try: # The following session configurations are not available in CodeBook, thus this try block
    rd.open_session(config_name="C:\\Example.DataLibrary.Python-main\\Example.DataLibrary.Python-main\\Configuration\\refinitiv-data.config.json")
    rd.open_session("desktop.workspace")
except:
    rd.open_session()
mcal.__version__
'4.1.0'
print(f"Here we are using the refinitiv Data Library version {rd.__version__}")
Here we are using the refinitiv Data Library version 1.0.0b24
FYI (For Your Information): We are running Python 3.8:
!python -V
Python 3.8.2
In this article, we will attempt to calculate the Implied Volatility (IV) of Future Options on 2 indexes (.STOXX50E & .SPX) trading 'ATM', meaning that the contract's strike price is at (or near, i.e.: within x%) parity with its underlying's current trading price (TRDPRC_1). We are also only looking for such Options expiring within a set time window, while allowing for that window to be open-ended, i.e.: Options that expire any time after the date of calculation. To do so, we first have to find the option in question. To find live Options, we best use the Search API. To find expired Options, we will use functions created in Haykaz's amazing articles "Finding Expired Options and Backtesting a Short Iron Condor Strategy" & "Functions to find Option RICs traded on different exchanges".
Live Options, in this context, are Options that have not expired at time of computation. To be explicit:
As aforementioned, to find live Options, we best use the Search API. Here we look for Call Options on .STOXX50E expiring between 2022-07-10 and the 3rd Friday of July 2023, 2023-07-21:
response1 = search.Definition(
view = search.Views.SEARCH_ALL, # To see what views are available: `help(search.Views)` & `search.metadata.Definition(view = search.Views.SEARCH_ALL).get_data().data.df.to_excel("SEARCH_ALL.xlsx")`
query=".STOXX50E",
select="DocumentTitle, RIC, StrikePrice, ExchangeCode, ExpiryDate, UnderlyingAsset, " +
"UnderlyingAssetName, UnderlyingAssetRIC, ESMAUnderlyingIndexCode, RCSUnderlyingMarket" +
"UnderlyingQuoteName, UnderlyingQuoteRIC, InsertDateTime, RetireDate",
filter="RCSAssetCategoryLeaf eq 'Option' and RIC eq 'STX*' and DocumentTitle ne '*Weekly*' " +
"and CallPutOption eq 'Call' and ExchangeCode eq 'EUX' and " +
"ExpiryDate ge 2022-07-10 and ExpiryDate lt 2023-07-22", # ge (greater than or equal to), gt (greater than), lt (less than) and le (less than or equal to). These can only be applied to numeric and date properties.
top=100).get_data()
searchDf1 = response1.data.df
searchDf1
| | DocumentTitle | RIC | StrikePrice | ExchangeCode | ExpiryDate | UnderlyingQuoteRIC | InsertDateTime | RetireDate |
|---|---|---|---|---|---|---|---|---|
| 0 | Eurex EURO STOXX 50 Index Option 4200 Call Apr... | STXE42000D3.EX | 4200 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 04:00:21 | 2023-04-25 |
| 1 | Eurex EURO STOXX 50 Index Option 4100 Call Apr... | STXE41000D3.EX | 4100 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 03:59:34 | 2023-04-25 |
| 2 | Eurex EURO STOXX 50 Index Option 4000 Call Apr... | STXE40000D3.EX | 4000 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 03:55:59 | 2023-04-25 |
| 3 | Eurex EURO STOXX 50 Index Option 4300 Call Apr... | STXE43000D3.EX | 4300 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 03:59:58 | 2023-04-25 |
| 4 | Eurex EURO STOXX 50 Index Option 4000 Call Jun... | STXE40000F3.EX | 4000 | EUX | 2023-06-16 | [.STOXX50E] | 2023-03-09 03:59:37 | 2023-06-20 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 95 | Eurex EURO STOXX 50 Index Option 4375 Call Jun... | STXE43750F3.EX | 4375 | EUX | 2023-06-16 | [.STOXX50E] | 2023-03-09 04:05:43 | 2023-06-20 |
| 96 | Eurex EURO STOXX 50 Index Option 3400 Call Apr... | STXE34000D3.EX | 3400 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 04:47:24 | 2023-04-25 |
| 97 | Eurex EURO STOXX 50 Index Option 3000 Call Apr... | STXE30000D3.EX | 3000 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 04:01:09 | 2023-04-25 |
| 98 | Eurex EURO STOXX 50 Index Option 2800 Call Apr... | STXE28000D3.EX | 2800 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 04:25:58 | 2023-04-25 |
| 99 | Eurex EURO STOXX 50 Index Option 4850 Call Apr... | STXE48500D3.EX | 4850 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 04:12:59 | 2023-04-25 |
100 rows × 8 columns
Let's now fetch the current underlying price and pick the option with the strike price closest to it, i.e.: the most 'At The Money'; note that this means the option can be in or out of the money, as long as it is the closest to at the money:
currentUnderlyingPrc = rd.get_history(
universe=[searchDf1.UnderlyingQuoteRIC[0][0]],
fields=["TRDPRC_1"],
interval="tick").iloc[-1][0]
currentUnderlyingPrc
4215.43
searchDf1.iloc[(searchDf1['StrikePrice']-currentUnderlyingPrc).abs().argsort()[:1]]
| | DocumentTitle | RIC | StrikePrice | ExchangeCode | ExpiryDate | UnderlyingQuoteRIC | InsertDateTime | RetireDate |
|---|---|---|---|---|---|---|---|---|
| 17 | Eurex EURO STOXX 50 Index Option 4225 Call Apr... | STXE42250D3.EX | 4225 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 03:50:17 | 2023-04-25 |
In this instance, for this Call Option, 'STXE42250D3.EX', the strike price is 4225, higher than the spot price of our underlying, which is 4215.43. The holder of this 'STXE42250D3.EX' option has the right (but not the obligation) to buy the underlying for 4225EUR, which, were the price of the underlying to stay the same until expiry (4215.43EUR on 2023-04-21), would mean a loss of (4225 - 4215.43 =) 9.57EUR. This option in this instance is 'Out-of-The-Money'.
N.B.: When using the Filter in Search and playing with dates, it is good to read the API Playground Documentation; it mentions that: "Dates are written in ISO datetime format. The time portion is optional, as is the timezone (assumed to be UTC unless otherwise specified). Valid examples include 2012-03-11T17:13:55Z, 2012-03-11T17:13:55, 2012-03-11T12:00-03:30, 2012-03-11.":
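As a side note, Python's standard library can parse that same ISO format (barring the trailing 'Z' designator before Python 3.11), should you wish to manipulate such dates locally; a quick illustration, unrelated to the Search API itself:

```python
from datetime import datetime

# Parse an ISO 8601 datetime string of the kind the Search filter accepts
dt = datetime.fromisoformat("2012-03-11T17:13:55")
print(dt)  # 2012-03-11 17:13:55
```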
Most of the time, market agents will be interested in the next expiring Option, unless we are too close to it. We would not be interested, for example, in an option expiring in 1 hour, or even tomorrow, because that is so close (in time) that the information reflected in the Option's trades in the market does not represent future expectations of its underlying, but current expectations of it.
To implement such logic, we need to know the expiry dates of the options that we are interested in. We are looking for a Python function narrowing our search to options expiring on the 3rd Friday of any one month. For info on this function, please read the articles "Finding Expired Options and Backtesting a Short Iron Condor Strategy" & "Functions to find Option RICs traded on different exchanges".
def Get_exp_dates(year, days=True, mcal_get_calendar='EUREX'):
'''
Get_exp_dates Version 2.0:
This function gets the expiration dates of monthly index options for a given year, which are the 3rd Fridays of each month.
Changes
----------------------------------------------
Changed from Version 1.0 to 2.0: Jonathan Legrand changed Haykaz Aramyan's original code to allow
(i) function name changed from `get_exp_dates` to `Get_exp_dates`
(ii) for the function's holiday argument to be changed, allowing for any calendar supported by `mcal.get_calendar` and defaulted to 'EUREX' as opposed to 'CBOE_Index_Options' and
(iii) for the function to output just the days of the month if argument days=True, or full date objects if days=False.
Dependencies
----------------------------------------------
Python library 'pandas_market_calendars' (imported as `mcal`), version '4.1.0'.
Parameters
-----------------------------------------------
Input:
year(int): year for which expiration days are requested
mcal_get_calendar(str): String of the calendar for which holidays have to be taken into account. More on this calendar (link to Github checked 2022-10-11): https://github.com/rsheftel/pandas_market_calendars/blob/177e7922c7df5ad249b0d066b5c9e730a3ee8596/pandas_market_calendars/exchange_calendar_cboe.py
Default: mcal_get_calendar='EUREX'
days(bool): If True, only the days of the month are output; else, datetime.date objects are output.
Default: days=True
Output:
dates(dict): dictionary of expiration days for each month of a specified year in datetime.date format.
'''
# get exchange market holidays (EUREX by default)
EUREXCal = mcal.get_calendar(mcal_get_calendar)
holidays = EUREXCal.holidays().holidays
# set calendar starting from Saturday
c = calendar.Calendar(firstweekday=calendar.SATURDAY)
# get the 3rd Friday of each month
exp_dates = {}
for i in range(1, 13):
monthcal = c.monthdatescalendar(year, i)
date = monthcal[2][-1]
# check if the found date is a holiday and get the previous date if it is
if date in holidays:
date = date + timedelta(-1)
# append the date to the dictionary
if year in exp_dates:
### Changed from original code from here on by Jonathan Legrand on 2022-10-11
if days: exp_dates[year].append(date.day)
else: exp_dates[year].append(date)
else:
if days: exp_dates[year] = [date.day]
else: exp_dates[year] = [date]
return exp_dates
fullDates = Get_exp_dates(2022, days=False)
dates = Get_exp_dates(2022)
fullDatesStrDict = {i: [fullDates[i][j].strftime('%Y-%m-%d')
for j in range(len(fullDates[i]))]
for i in list(fullDates.keys())}
fullDatesDayDict = {i: [fullDates[i][j].day
for j in range(len(fullDates[i]))]
for i in list(fullDates.keys())}
print(fullDates)
{2022: [datetime.date(2022, 1, 21), datetime.date(2022, 2, 18), datetime.date(2022, 3, 18), datetime.date(2022, 4, 14), datetime.date(2022, 5, 20), datetime.date(2022, 6, 17), datetime.date(2022, 7, 15), datetime.date(2022, 8, 19), datetime.date(2022, 9, 16), datetime.date(2022, 10, 21), datetime.date(2022, 11, 18), datetime.date(2022, 12, 16)]}
print(fullDatesStrDict)
{2022: ['2022-01-21', '2022-02-18', '2022-03-18', '2022-04-14', '2022-05-20', '2022-06-17', '2022-07-15', '2022-08-19', '2022-09-16', '2022-10-21', '2022-11-18', '2022-12-16']}
print(dates)
{2022: [21, 18, 18, 14, 20, 17, 15, 19, 16, 21, 18, 16]}
print(fullDatesDayDict)
{2022: [21, 18, 18, 14, 20, 17, 15, 19, 16, 21, 18, 16]}
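The third-Friday trick used inside `Get_exp_dates` can be seen in isolation: with weeks starting on Saturday, every week row that `calendar` returns ends on a Friday, so the last element of the third row is the 3rd Friday (holidays aside). For example, for June 2022:

```python
import calendar

# Weeks start on Saturday, so each week row ends on a Friday;
# row [2] is the 3rd such row, hence its last element is the 3rd Friday.
c = calendar.Calendar(firstweekday=calendar.SATURDAY)
third_friday = c.monthdatescalendar(2022, 6)[2][-1]
print(third_friday)  # 2022-06-17, matching the June entry printed above
```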
E.g.: I would like to know the next Future (Monthly) Option (i) on the Index '.STOXX50E', (ii) closest to ATM (i.e.: with an underlying spot price closest to the option's strike price), (iii) expiring in more than x days (i.e.: not too close to calculation time 't'), let's say 15 days:
x = 15
timeOfCalcDatetime = datetime.now() # For now, we will focus on the use-case where we are calculating values for today; later, we will allow for any day going back a few business days.
timeOfCalcStr = datetime.now().strftime('%Y-%m-%d')
timeOfCalcStr
'2023-03-29'
fullDatesAtTimeOfCalc = Get_exp_dates(timeOfCalcDatetime.year, days=False) # `timeOfCalcDatetime.year` here is 2023
fullDatesAtTimeOfCalcDatetime = [
datetime(i.year, i.month, i.day)
for i in fullDatesAtTimeOfCalc[list(fullDatesAtTimeOfCalc.keys())[0]]]
print(fullDatesAtTimeOfCalcDatetime)
[datetime.datetime(2023, 1, 20, 0, 0), datetime.datetime(2023, 2, 17, 0, 0), datetime.datetime(2023, 3, 17, 0, 0), datetime.datetime(2023, 4, 21, 0, 0), datetime.datetime(2023, 5, 19, 0, 0), datetime.datetime(2023, 6, 16, 0, 0), datetime.datetime(2023, 7, 21, 0, 0), datetime.datetime(2023, 8, 18, 0, 0), datetime.datetime(2023, 9, 15, 0, 0), datetime.datetime(2023, 10, 20, 0, 0), datetime.datetime(2023, 11, 17, 0, 0), datetime.datetime(2023, 12, 15, 0, 0)]
expiryDateOfInt = [i for i in fullDatesAtTimeOfCalcDatetime
if i > timeOfCalcDatetime + relativedelta(days=x)][0]
expiryDateOfInt
datetime.datetime(2023, 4, 21, 0, 0)
Now we can look for the one option we're after:
response2 = search.Definition(
view=search.Views.SEARCH_ALL, # To see what views are available: `help(search.Views)` & `search.metadata.Definition(view = search.Views.SEARCH_ALL).get_data().data.df.to_excel("SEARCH_ALL.xlsx")`
query=".STOXX50E",
select="DocumentTitle, RIC, StrikePrice, ExchangeCode, ExpiryDate, UnderlyingAsset, " +
"UnderlyingAssetName, UnderlyingAssetRIC, ESMAUnderlyingIndexCode, RCSUnderlyingMarket" +
"UnderlyingQuoteName, UnderlyingQuoteRIC, InsertDateTime, RetireDate",
filter="RCSAssetCategoryLeaf eq 'Option' and RIC eq 'STX*' and DocumentTitle ne '*Weekly*' " +
"and CallPutOption eq 'Call' and ExchangeCode eq 'EUX' and " +
f"ExpiryDate ge {(expiryDateOfInt - relativedelta(days=1)).strftime('%Y-%m-%d')} " +
f"and ExpiryDate lt {(expiryDateOfInt + relativedelta(days=1)).strftime('%Y-%m-%d')}", # ge (greater than or equal to), gt (greater than), lt (less than) and le (less than or equal to). These can only be applied to numeric and date properties.
top=10000,
).get_data()
searchDf2 = response2.data.df
searchDf2
| | DocumentTitle | RIC | StrikePrice | ExchangeCode | ExpiryDate | UnderlyingQuoteRIC | InsertDateTime | RetireDate |
|---|---|---|---|---|---|---|---|---|
| 0 | Eurex EURO STOXX 50 Index Option 4200 Call Apr... | STXE42000D3.EX | 4200 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 04:00:21 | 2023-04-25 |
| 1 | Eurex EURO STOXX 50 Index Option 4100 Call Apr... | STXE41000D3.EX | 4100 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 03:59:34 | 2023-04-25 |
| 2 | Eurex EURO STOXX 50 Index Option 4000 Call Apr... | STXE40000D3.EX | 4000 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 03:55:59 | 2023-04-25 |
| 3 | Eurex EURO STOXX 50 Index Option 4300 Call Apr... | STXE43000D3.EX | 4300 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 03:59:58 | 2023-04-25 |
| 4 | Eurex EURO STOXX 50 Index Option 4125 Call Apr... | STXE41250D3.EX | 4125 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 03:50:14 | 2023-04-25 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 150 | Eurex EURO STOXX 50 Index Option 1900 Call Apr... | STXE19000D3.EX | 1900 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 04:22:12 | 2023-04-25 |
| 151 | Eurex EURO STOXX 50 Index Option 7000 Call Apr... | STXE70000D3.EX | 7000 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 05:50:32 | 2023-04-25 |
| 152 | Eurex EURO STOXX 50 Index Option 8000 Call Apr... | STXE80000D3.EX | 8000 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 06:17:00 | 2023-04-25 |
| 153 | Eurex EURO STOXX 50 Index Option 9000 Call Apr... | STXE90000D3.EX | 9000 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 06:11:01 | 2023-04-25 |
| 154 | Eurex EURO STOXX 50 Index Option 10000 Call Ap... | STXE100000D3.EX | 10000 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 06:03:00 | 2023-04-25 |
155 rows × 8 columns
And again, we can collect the closest to ATM:
searchDf2.iloc[(searchDf2['StrikePrice']-currentUnderlyingPrc).abs().argsort()[:1]]
| | DocumentTitle | RIC | StrikePrice | ExchangeCode | ExpiryDate | UnderlyingQuoteRIC | InsertDateTime | RetireDate |
|---|---|---|---|---|---|---|---|---|
| 13 | Eurex EURO STOXX 50 Index Option 4225 Call Apr... | STXE42250D3.EX | 4225 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 03:50:17 | 2023-04-25 |
Now we have our instrument:
instrument = searchDf2.iloc[(searchDf2['StrikePrice']-currentUnderlyingPrc).abs().argsort()[:1]].RIC.values[0]
instrument
'STXE42250D3.EX'
Refinitiv provides pre-calculated Implied Volatility values, but they are daily, and we will look into calculating them in higher frequencies:
## Example Options:
# instrument_1 = 'SPXv212240000.U'
# instrument_2 = 'STXE35500J2.EX' # Eurex Dow Jones EURO STOXX 50 Index Option 3550 Call Oct 2022, Stock Index Cash Option, Underlying RIC: .STOXX50E
# instrument_3 = 'SPXj212240000.U'
datetime.now().isoformat(timespec='minutes')
'2023-03-29T14:48'
start = (timeOfCalcDatetime - pd.tseries.offsets.BDay(5)).strftime('%Y-%m-%dT%H:%M:%S.%f') # '2022-10-05T07:30:00.000'
endDateTime = datetime.now()
end = endDateTime.strftime('%Y-%m-%dT%H:%M:%S.%f') # e.g.: '2022-09-09T20:00:00.000'
end
'2023-03-29T14:48:15.430844'
_RefDailyImpVolDf = historical_pricing.events.Definition(
instrument, fields=['IMP_VOLT'], count=2000).get_data()
_RefDailyImpVolDf.data.df.head()
| STXE42250D3.EX | IMP_VOLT |
|---|---|
| Timestamp | |
| 2022-12-28 00:53:31.808 | 15.6212 |
| 2022-12-29 00:53:36.379 | 15.3931 |
| 2022-12-30 00:53:41.120 | 15.2385 |
| 2022-12-31 00:53:31.819 | 15.3381 |
| 2023-01-03 00:53:39.716 | 15.8512 |
try: RefDailyImpVolDf = _RefDailyImpVolDf.data.df.drop(['EVENT_TYPE'], axis=1) # In codebook, this line is needed
except: RefDailyImpVolDf = _RefDailyImpVolDf.data.df # If outside of codebook
fig = px.line(RefDailyImpVolDf, title = RefDailyImpVolDf.columns.name + " " + RefDailyImpVolDf.columns[0]) # This is just to see the implied vol graph when that field is available
fig.show()
# rd.get_history(
# universe=["STXE35500J2.EX"],
# fields=["TRDPRC_1"],
# interval="tick")
_optnMrktPrice = rd.get_history(
universe=[instrument],
fields=["TRDPRC_1"],
interval="10min",
start=start, # Ought to always start at 4 am for OPRA exchanged Options, more info in the article below
end=end) # Ought to always end at 8 pm for OPRA exchanged Options, more info in the article below
As you can see, there isn't necessarily a trade every 10 min.:
_optnMrktPrice.head()
| STXE42250D3.EX | TRDPRC_1 |
|---|---|
| Timestamp | |
| 2023-03-23 10:10:00 | 65.7 |
| 2023-03-23 10:20:00 | 63.6 |
| 2023-03-23 10:30:00 | 63.6 |
| 2023-03-23 11:10:00 | 67.6 |
| 2023-03-23 13:00:00 | 70.0 |
However, for the statistical inferences that we will make further in the article, when we calculate Implied Volatilities and therefore implement the Black-Scholes model, we will need 'continuous' timeseries to work with. There are several ways to go from a discrete time series (like ours, even if we go down to tick data) to a continuous one, but for this article we will first make 'buckets' of 10 min. If no trade is made in any 10 min. bucket, we will assume the price to have stayed the same as previously, throughout the exchange's trading hours.
Thankfully this is simple. Let's stick with the EUREX for now:
optnMrktPrice = _optnMrktPrice.resample('10Min').mean() # get a datapoint every 10 min
optnMrktPrice = optnMrktPrice[optnMrktPrice.index.strftime('%Y-%m-%d').isin([i for i in _optnMrktPrice.index.strftime('%Y-%m-%d').unique()])] # Only keep trading days
optnMrktPrice = optnMrktPrice.loc[(optnMrktPrice.index.strftime('%H:%M:%S') >= '07:30:00') & (optnMrktPrice.index.strftime('%H:%M:%S') <= '22:00:00')] # Only keep trading hours
optnMrktPrice.fillna(method='ffill', inplace=True) # Forward Fill to populate NaN values
print(f"Our dataframe started at {str(optnMrktPrice.index[0])} and went on continuously till {str(optnMrktPrice.index[-1])}, so out of trading hours rows are removed")
optnMrktPrice
Our dataframe started at 2023-03-23 10:10:00 and went on continuously till 2023-03-29 11:30:00, so out of trading hours rows are removed
| STXE42250D3.EX | TRDPRC_1 |
|---|---|
| Timestamp | |
| 2023-03-23 10:10:00 | 65.7 |
| 2023-03-23 10:20:00 | 63.6 |
| 2023-03-23 10:30:00 | 63.6 |
| 2023-03-23 10:40:00 | 63.6 |
| 2023-03-23 10:50:00 | 63.6 |
| ... | ... |
| 2023-03-29 10:50:00 | 65.9 |
| 2023-03-29 11:00:00 | 65.9 |
| 2023-03-29 11:10:00 | 65.9 |
| 2023-03-29 11:20:00 | 65.9 |
| 2023-03-29 11:30:00 | 68.4 |
361 rows × 1 columns
Note that the option might not have traded in the past 10 min. This can cause issues in the code below; we thus ought to add a row for the current time:
# optnMrktPrice = optnMrktPrice.append(
# pd.DataFrame(
# [[pd.NA]], columns=optnMrktPrice.columns,
# index=[(endDateTime + (datetime.min - endDateTime) % timedelta(minutes=10))]))
# optnMrktPrice
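Note that `DataFrame.append`, used in the commented-out snippet above, was removed in pandas 2.0; with recent pandas versions the same row-addition can be sketched with `pd.concat` (the values below are illustrative stand-ins, not the article's live data):

```python
import pandas as pd
from datetime import datetime, timedelta

optnMrktPrice = pd.DataFrame({"TRDPRC_1": [68.4]},
                             index=[pd.Timestamp("2023-03-29 11:30:00")])
endDateTime = datetime(2023, 3, 29, 14, 48)
# Round `endDateTime` up to the next 10-minute boundary, as the original snippet does
nextBucket = endDateTime + (datetime.min - endDateTime) % timedelta(minutes=10)
# Append one empty (NA) row stamped at that boundary
newRow = pd.DataFrame([[pd.NA]], columns=optnMrktPrice.columns, index=[nextBucket])
optnMrktPrice = pd.concat([optnMrktPrice, newRow])
print(optnMrktPrice.index[-1])  # 2023-03-29 14:50:00
```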
Note also that one may want to only look at 'At Option Trade' datapoints, i.e.: Implied Volatility when a trade is made for the Option, but not when none is made. For this, we will use the 'At Trade' (AT) dataframes:
AToptnMrktPrice = _optnMrktPrice
AToptnMrktPrice
| STXE42250D3.EX | TRDPRC_1 |
|---|---|
| Timestamp | |
| 2023-03-23 10:10:00 | 65.7 |
| 2023-03-23 10:20:00 | 63.6 |
| 2023-03-23 10:30:00 | 63.6 |
| 2023-03-23 11:10:00 | 67.6 |
| 2023-03-23 13:00:00 | 70.0 |
| 2023-03-23 13:40:00 | 68.6 |
| 2023-03-23 13:50:00 | 69.6 |
| 2023-03-23 14:40:00 | 73.2 |
| 2023-03-23 15:20:00 | 78.0 |
| 2023-03-23 16:10:00 | 76.0 |
| 2023-03-23 16:20:00 | 75.2 |
| 2023-03-24 08:00:00 | 67.0 |
| 2023-03-24 09:00:00 | 58.5 |
| 2023-03-24 09:20:00 | 56.7 |
| 2023-03-24 10:20:00 | 55.2 |
| 2023-03-24 10:30:00 | 51.3 |
| 2023-03-24 11:20:00 | 49.6 |
| 2023-03-24 12:10:00 | 48.0 |
| 2023-03-24 12:20:00 | 48.0 |
| 2023-03-24 12:40:00 | 46.6 |
| 2023-03-24 13:40:00 | 49.5 |
| 2023-03-24 13:50:00 | 50.6 |
| 2023-03-24 14:00:00 | 54.0 |
| 2023-03-24 14:30:00 | 48.7 |
| 2023-03-24 15:20:00 | 46.8 |
| 2023-03-24 15:40:00 | 49.5 |
| 2023-03-24 15:50:00 | 49.8 |
| 2023-03-24 16:00:00 | 51.2 |
| 2023-03-24 16:20:00 | 47.2 |
| 2023-03-27 07:20:00 | 59.0 |
| 2023-03-27 07:50:00 | 55.7 |
| 2023-03-27 08:00:00 | 53.4 |
| 2023-03-27 09:50:00 | 58.2 |
| 2023-03-27 10:00:00 | 60.5 |
| 2023-03-27 10:20:00 | 61.1 |
| 2023-03-28 07:40:00 | 63.6 |
| 2023-03-28 08:20:00 | 60.0 |
| 2023-03-28 08:40:00 | 57.4 |
| 2023-03-28 08:50:00 | 58.0 |
| 2023-03-28 11:10:00 | 55.5 |
| 2023-03-28 11:20:00 | 55.5 |
| 2023-03-28 13:50:00 | 50.5 |
| 2023-03-28 14:20:00 | 49.5 |
| 2023-03-28 14:30:00 | 49.0 |
| 2023-03-28 15:20:00 | 51.1 |
| 2023-03-29 09:50:00 | 70.1 |
| 2023-03-29 10:10:00 | 65.9 |
| 2023-03-29 11:30:00 | 68.4 |
Now let's get data for the underlying, which we need to calculate IV:
underlying = searchDf2.iloc[(searchDf2['StrikePrice']-currentUnderlyingPrc).abs().argsort()[:1]].UnderlyingQuoteRIC.values[0][0]
underlying
'.STOXX50E'
If you are interested in the opening times of any one exchange, you can use the following:
hoursDf = rd.get_data(
universe=["EUREX21"],
fields=["ROW80_10"])
display(hoursDf)
hoursDf.iloc[0,1]
| | Instrument | ROW80_10 |
|---|---|---|
| 0 | EUREX21 | OGBL/OGBM/OGBS 07:30-08:00 08:0... |
' OGBL/OGBM/OGBS 07:30-08:00 08:00-19:00 19:00-20:00 '
_underlyingMrktPrice = rd.get_history(
universe=[underlying],
fields=["TRDPRC_1"],
interval="10min",
start=start,
end=end)
_underlyingMrktPrice
| .STOXX50E | TRDPRC_1 |
|---|---|
| Timestamp | |
| 2023-03-22 14:50:00 | 4207.1 |
| 2023-03-22 15:00:00 | 4202.74 |
| 2023-03-22 15:10:00 | 4199.54 |
| 2023-03-22 15:20:00 | 4202.01 |
| 2023-03-22 15:30:00 | 4202.94 |
| ... | ... |
| 2023-03-29 12:00:00 | 4213.68 |
| 2023-03-29 12:10:00 | 4213.48 |
| 2023-03-29 12:20:00 | 4214.88 |
| 2023-03-29 12:30:00 | 4215.13 |
| 2023-03-29 12:40:00 | 4215.61 |
259 rows × 1 columns
ATunderlyingMrktPrice = AToptnMrktPrice.join(
_underlyingMrktPrice, lsuffix='_OptPr', rsuffix='_UnderlyingPr', how='inner')
ATunderlyingMrktPrice.head(2)
| | TRDPRC_1_OptPr | TRDPRC_1_UnderlyingPr |
|---|---|---|
| Timestamp | | |
| 2023-03-23 10:10:00 | 65.7 | 4176.5 |
| 2023-03-23 10:20:00 | 63.6 | 4170.06 |
Let's put it all in one data-frame, `df`. Some datasets will have data going from the time we set as `start` all the way to `end`. Some won't, because no trade happened in the last few minutes/hours. We ought to base ourselves on the dataset with values closest to `end` and forward-fill the other column. As a result, the following if block is needed:
if optnMrktPrice.index[-1] >= _underlyingMrktPrice.index[-1]:
df = optnMrktPrice.copy()
df['underlying ' + underlying + ' TRDPRC_1'] = _underlyingMrktPrice
else:
df = _underlyingMrktPrice.copy()
df.rename(columns={"TRDPRC_1": 'underlying ' + underlying + ' TRDPRC_1'}, inplace=True)
df['TRDPRC_1'] = optnMrktPrice
df.columns.name = optnMrktPrice.columns.name
df.fillna(method='ffill', inplace=True) # Forward Fill to populate NaN values
df = df.dropna()
df
| STXE42250D3.EX | underlying .STOXX50E TRDPRC_1 | TRDPRC_1 |
|---|---|---|
| Timestamp | ||
| 2023-03-23 10:10:00 | 4176.5 | 65.7 |
| 2023-03-23 10:20:00 | 4170.06 | 63.6 |
| 2023-03-23 10:30:00 | 4169.96 | 63.6 |
| 2023-03-23 10:40:00 | 4170.47 | 63.6 |
| 2023-03-23 10:50:00 | 4171.68 | 63.6 |
| ... | ... | ... |
| 2023-03-29 12:00:00 | 4213.68 | 68.4 |
| 2023-03-29 12:10:00 | 4213.48 | 68.4 |
| 2023-03-29 12:20:00 | 4214.88 | 68.4 |
| 2023-03-29 12:30:00 | 4215.13 | 68.4 |
| 2023-03-29 12:40:00 | 4215.61 | 68.4 |
234 rows × 2 columns
strikePrice = searchDf2.iloc[
(searchDf2['StrikePrice']-currentUnderlyingPrc).abs().argsort()[:1]].StrikePrice.values[0]
strikePrice
4225
_EurRfRate = rd.get_history(
universe=['EURIBOR3MD='], # USD3MFSR=, USDSOFR=
fields=['TR.FIXINGVALUE'],
# Since we will use `dropna()` as a way to select the rows we are after later on in the code, we need to ask for more risk-free data than needed, just in case we don't have enough:
start=(datetime.strptime(start, '%Y-%m-%dT%H:%M:%S.%f') - timedelta(days=1)).strftime('%Y-%m-%d'),
end=(datetime.strptime(end, '%Y-%m-%dT%H:%M:%S.%f') + timedelta(days=1)).strftime('%Y-%m-%d'))
_EurRfRate
| EURIBOR3MD= | Fixing Value |
|---|---|
| Date | |
| 2023-03-29 | 3.015 |
| 2023-03-28 | 2.99 |
| 2023-03-27 | 3.012 |
| 2023-03-24 | 3.025 |
| 2023-03-23 | 2.99 |
| 2023-03-22 | 3.002 |
| 2023-03-21 | 2.908 |
Euribor values are released daily at 11am CET, and are published as such on Refinitiv:
EurRfRate = _EurRfRate.resample('10Min').mean().fillna(method='ffill')
df['EurRfRate'] = EurRfRate
You might be running your code after the latest Risk-Free Rate was published, in which case the most accurate value to use is that latest one, thus the use of ffill:
df = df.fillna(method='ffill')
df
| STXE42250D3.EX | underlying .STOXX50E TRDPRC_1 | TRDPRC_1 | EurRfRate |
|---|---|---|---|
| Timestamp | |||
| 2023-03-23 10:10:00 | 4176.5 | 65.7 | 2.99 |
| 2023-03-23 10:20:00 | 4170.06 | 63.6 | 2.99 |
| 2023-03-23 10:30:00 | 4169.96 | 63.6 | 2.99 |
| 2023-03-23 10:40:00 | 4170.47 | 63.6 | 2.99 |
| 2023-03-23 10:50:00 | 4171.68 | 63.6 | 2.99 |
| ... | ... | ... | ... |
| 2023-03-29 12:00:00 | 4213.68 | 68.4 | 2.99 |
| 2023-03-29 12:10:00 | 4213.48 | 68.4 | 2.99 |
| 2023-03-29 12:20:00 | 4214.88 | 68.4 | 2.99 |
| 2023-03-29 12:30:00 | 4215.13 | 68.4 | 2.99 |
| 2023-03-29 12:40:00 | 4215.61 | 68.4 | 2.99 |
234 rows × 3 columns
Now for the At Trade dataframe:
pd.options.mode.chained_assignment = None # default='warn'
ATunderlyingMrktPrice['EurRfRate'] = [pd.NA for i in ATunderlyingMrktPrice.index]
for i in _EurRfRate.index:
_i = str(i)[:10]
for n, j in enumerate(ATunderlyingMrktPrice.index):
if _i in str(j):
if len(_EurRfRate.loc[i].values) == 2:
ATunderlyingMrktPrice['EurRfRate'].iloc[n] = _EurRfRate.loc[i].values[0][0]
elif len(_EurRfRate.loc[i].values) == 1:
ATunderlyingMrktPrice['EurRfRate'].iloc[n] = _EurRfRate.loc[i].values[0]
ATdf = ATunderlyingMrktPrice.copy()
Again, you might be running your code after the latest Risk-Free Rate was published, in which case the most accurate value to use is that latest one, thus the use of ffill:
ATdf = ATdf.fillna(method='ffill')
ATdf.head(2)
| | TRDPRC_1_OptPr | TRDPRC_1_UnderlyingPr | EurRfRate |
|---|---|---|---|
| Timestamp | | | |
| 2023-03-23 10:10:00 | 65.7 | 4176.5 | 2.99 |
| 2023-03-23 10:20:00 | 63.6 | 4170.06 | 2.99 |
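The date-matching double loop above can also be vectorized. A minimal sketch on illustrative data (the frames and column names below are stand-ins, not the article's actual variables): normalize each intraday timestamp to its calendar day, then reindex the daily fixings onto those days:

```python
import pandas as pd

# Daily rate fixings, indexed by date (illustrative values)
rates = pd.DataFrame({"Fixing Value": [2.99, 3.025]},
                     index=pd.to_datetime(["2023-03-23", "2023-03-24"]))
# Intraday trades, indexed by timestamp (illustrative values)
trades = pd.DataFrame({"TRDPRC_1": [65.7, 63.6, 67.0]},
                      index=pd.to_datetime(["2023-03-23 10:10", "2023-03-23 10:20",
                                            "2023-03-24 08:00"]))
# Map each intraday timestamp to its calendar day, then look that day's fixing up
trades["EurRfRate"] = rates["Fixing Value"].reindex(trades.index.normalize()).values
print(trades)
```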
We are going to assume no dividends.
On the Developer Portal, one can see documentation about the Instrument Pricing Analytics service that gives access to calculation functions (which used to be called 'AdFin'). This service is accessible via several RESTful endpoints (in a family of endpoints called 'Quantitative Analytics') which can be used via RD. While we are going to build towards a Class that puts all our concepts together, I first want to showcase the several ways in which we can collect the data we're after, for (i) all trades & (ii) at option trades only (i.e.: not every trade of the underlying) and (a) using the RD delivery layer & (b) the RD content layer:
Data returned thus far was time-stamped in the GMT Time Zone; we need to re-calibrate it to the timezone of our machine:
dfGMT = df.copy()
dfLocalTimeZone = df.copy()
dfLocalTimeZone.index = [
df.index[i].replace(
tzinfo=pytz.timezone(
'GMT')).astimezone(
tz=datetime.now().astimezone().tzinfo)
for i in range(len(df))]
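As an aside, pandas can perform this timezone conversion natively and in one vectorized call, avoiding the per-row list comprehension; a sketch on illustrative data (using 'Europe/Paris' as an example target zone rather than the machine's local zone):

```python
import pandas as pd

idx = pd.to_datetime(["2023-03-23 10:10", "2023-03-23 10:20"])
df = pd.DataFrame({"TRDPRC_1": [65.7, 63.6]}, index=idx)
# Localize the naive GMT timestamps, then convert the whole index at once
df.index = df.index.tz_localize("GMT").tz_convert("Europe/Paris")
print(df.index[0])  # 2023-03-23 11:10:00+01:00 (CET in late March, before DST)
```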
dfGMT
| STXE42250D3.EX | underlying .STOXX50E TRDPRC_1 | TRDPRC_1 | EurRfRate |
|---|---|---|---|
| Timestamp | |||
| 2023-03-23 10:10:00 | 4176.5 | 65.7 | 2.99 |
| 2023-03-23 10:20:00 | 4170.06 | 63.6 | 2.99 |
| 2023-03-23 10:30:00 | 4169.96 | 63.6 | 2.99 |
| 2023-03-23 10:40:00 | 4170.47 | 63.6 | 2.99 |
| 2023-03-23 10:50:00 | 4171.68 | 63.6 | 2.99 |
| ... | ... | ... | ... |
| 2023-03-29 12:00:00 | 4213.68 | 68.4 | 2.99 |
| 2023-03-29 12:10:00 | 4213.48 | 68.4 | 2.99 |
| 2023-03-29 12:20:00 | 4214.88 | 68.4 | 2.99 |
| 2023-03-29 12:30:00 | 4215.13 | 68.4 | 2.99 |
| 2023-03-29 12:40:00 | 4215.61 | 68.4 | 2.99 |
234 rows × 3 columns
dfLocalTimeZone
| STXE42250D3.EX | underlying .STOXX50E TRDPRC_1 | TRDPRC_1 | EurRfRate |
|---|---|---|---|
| 2023-03-23 12:10:00+02:00 | 4176.5 | 65.7 | 2.99 |
| 2023-03-23 12:20:00+02:00 | 4170.06 | 63.6 | 2.99 |
| 2023-03-23 12:30:00+02:00 | 4169.96 | 63.6 | 2.99 |
| 2023-03-23 12:40:00+02:00 | 4170.47 | 63.6 | 2.99 |
| 2023-03-23 12:50:00+02:00 | 4171.68 | 63.6 | 2.99 |
| ... | ... | ... | ... |
| 2023-03-29 14:00:00+02:00 | 4213.68 | 68.4 | 2.99 |
| 2023-03-29 14:10:00+02:00 | 4213.48 | 68.4 | 2.99 |
| 2023-03-29 14:20:00+02:00 | 4214.88 | 68.4 | 2.99 |
| 2023-03-29 14:30:00+02:00 | 4215.13 | 68.4 | 2.99 |
| 2023-03-29 14:40:00+02:00 | 4215.61 | 68.4 | 2.99 |
234 rows × 3 columns
requestFields = [
"MarketValueInDealCcy", "RiskFreeRatePercent",
"UnderlyingPrice", "PricingModelType",
"DividendType",
"UnderlyingTimeStamp", "ReportCcy",
"VolatilityType", "Volatility",
"DeltaPercent", "GammaPercent",
"RhoPercent", "ThetaPercent",
"VegaPercent"]
Now for the At Trade dataframe:
universeL = [
{
"instrumentType": "Option",
"instrumentDefinition": {
"buySell": "Buy",
"underlyingType": "Eti",
"instrumentCode": instrument,
"strike": str(strikePrice),
},
"pricingParameters": {
"marketValueInDealCcy": str(dfLocalTimeZone['TRDPRC_1'][i]),
"riskFreeRatePercent": str(dfLocalTimeZone['EurRfRate'][i]),
"underlyingPrice": str(dfLocalTimeZone['underlying ' + underlying + ' TRDPRC_1'][i]),
"pricingModelType": "BlackScholes",
"dividendType": "ImpliedYield",
"volatilityType": "Implied",
"underlyingTimeStamp": "Default",
"reportCcy": "EUR"
}
}
for i in range(len(dfLocalTimeZone.index))]
def Chunks(lst, n):
"""Yield successive n-sized chunks from lst."""
for i in range(0, len(lst), n):
yield lst[i:i + n]
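A quick sanity check of `Chunks` (repeating its definition so this cell stands alone): a list of 7 items in chunks of 3 yields two full chunks and one remainder.

```python
def Chunks(lst, n):
    """Yield successive n-sized chunks from lst."""
    for i in range(0, len(lst), n):
        yield lst[i:i + n]

batches = list(Chunks(list(range(7)), 3))
print(batches)  # [[0, 1, 2], [3, 4, 5], [6]]
```

This is why we batch requests in groups of 100 below: the service accepts up to 100 instrument definitions per call.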
This is the cell, coming up next below, that has a rather high chance of failing. This is because there is no error handling of any kind, in case there are issues on the servers from which we are retrieving data. The Content Layer functions do have such error-handling steps, and are therefore considerably less likely to fail or run into errors.
batchOf = 100
for i, j in enumerate(Chunks(universeL, batchOf)):
print(f"Batch of {batchOf} requests no. {str(i+1)}/{str(len([i for i in Chunks(universeL, batchOf)]))} started")
# Request with body parameters sent to the IPA financial-contracts endpoint
request_definition = rd.delivery.endpoint_request.Definition(
method=rd.delivery.endpoint_request.RequestMethod.POST,
url='https://api.refinitiv.com/data/quantitative-analytics/v1/financial-contracts',
body_parameters={"fields": requestFields,
"outputs": ["Data", "Headers"],
"universe": j})
response3 = request_definition.get_data()
headers_name = [h['name'] for h in response3.data.raw['headers']]
if i == 0:
response3df = pd.DataFrame(
data=response3.data.raw['data'], columns=headers_name)
else:
_response3df = pd.DataFrame(
data=response3.data.raw['data'], columns=headers_name)
response3df = pd.concat([response3df, _response3df], ignore_index=True)  # DataFrame.append was removed in pandas 2.0
# display(_response3df)
print(f"Batch of {batchOf} requests no. {str(i+1)}/{str(len([i for i in Chunks(universeL, batchOf)]))} ended")
Batch of 100 requests no. 1/3 started Batch of 100 requests no. 1/3 ended Batch of 100 requests no. 2/3 started Batch of 100 requests no. 2/3 ended Batch of 100 requests no. 3/3 started Batch of 100 requests no. 3/3 ended
response3df
| MarketValueInDealCcy | RiskFreeRatePercent | UnderlyingPrice | PricingModelType | DividendType | UnderlyingTimeStamp | ReportCcy | VolatilityType | Volatility | DeltaPercent | GammaPercent | RhoPercent | ThetaPercent | VegaPercent | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 65.7 | 2.99 | 4176.50 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 20.650307 | 0.425992 | 0.001809 | 1.079711 | -1.884532 | 4.106245 |
| 1 | 63.6 | 2.99 | 4170.06 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 20.798941 | 0.415109 | 0.001789 | 1.050709 | -1.883593 | 4.077227 |
| 2 | 63.6 | 2.99 | 4169.96 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 20.809121 | 0.414981 | 0.001788 | 1.050346 | -1.884306 | 4.076843 |
| 3 | 63.6 | 2.99 | 4170.47 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 20.757180 | 0.415636 | 0.001793 | 1.052202 | -1.880663 | 4.078800 |
| 4 | 63.6 | 2.99 | 4171.68 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 20.633718 | 0.417204 | 0.001805 | 1.056641 | -1.871975 | 4.083428 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 229 | 68.4 | 2.99 | 4213.68 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 17.233868 | 0.488814 | 0.002185 | 1.254797 | -1.626663 | 4.212997 |
| 230 | 68.4 | 2.99 | 4213.48 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 17.257066 | 0.488415 | 0.002182 | 1.253675 | -1.628624 | 4.212683 |
| 231 | 68.4 | 2.99 | 4214.88 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 17.094316 | 0.491231 | 0.002203 | 1.261585 | -1.614832 | 4.214797 |
| 232 | 68.4 | 2.99 | 4215.13 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 17.065163 | 0.491740 | 0.002206 | 1.263013 | -1.612353 | 4.215154 |
| 233 | 68.4 | 2.99 | 4215.61 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 17.009111 | 0.492721 | 0.002213 | 1.265768 | -1.607577 | 4.215820 |
234 rows × 14 columns
As may (or may not) have been apparent above, the delivery layer does not offer any error-handling management. The server from which we are requesting data may be busy, so we may get unsuccessful messages back. You could build error-handling logic yourself, but let's not reinvent the wheel when the RD Python Library exists!
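If you do want to stay on the delivery layer, a simple retry wrapper can add basic resilience. This is a minimal sketch: `with_retries` is a hypothetical helper, not part of the RD library, and the `send_request` callable stands in for the `request_definition.get_data()` call above.

```python
import time

def with_retries(send_request, max_attempts=3, backoff_s=2.0):
    """Call send_request(), retrying with linear back-off on failure."""
    for attempt in range(1, max_attempts + 1):
        try:
            return send_request()
        except Exception as err:  # the server may be busy or time out
            if attempt == max_attempts:
                raise
            print(f"Attempt {attempt} failed ({err}); retrying...")
            time.sleep(backoff_s * attempt)

# Usage sketch: response = with_retries(lambda: request_definition.get_data())
```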
dfLocalTimeZone
| STXE42250D3.EX | underlying .STOXX50E TRDPRC_1 | TRDPRC_1 | EurRfRate |
|---|---|---|---|
| 2023-03-23 12:10:00+02:00 | 4176.5 | 65.7 | 2.99 |
| 2023-03-23 12:20:00+02:00 | 4170.06 | 63.6 | 2.99 |
| 2023-03-23 12:30:00+02:00 | 4169.96 | 63.6 | 2.99 |
| 2023-03-23 12:40:00+02:00 | 4170.47 | 63.6 | 2.99 |
| 2023-03-23 12:50:00+02:00 | 4171.68 | 63.6 | 2.99 |
| ... | ... | ... | ... |
| 2023-03-29 14:00:00+02:00 | 4213.68 | 68.4 | 2.99 |
| 2023-03-29 14:10:00+02:00 | 4213.48 | 68.4 | 2.99 |
| 2023-03-29 14:20:00+02:00 | 4214.88 | 68.4 | 2.99 |
| 2023-03-29 14:30:00+02:00 | 4215.13 | 68.4 | 2.99 |
| 2023-03-29 14:40:00+02:00 | 4215.61 | 68.4 | 2.99 |
234 rows × 3 columns
CuniverseL = [ # C here is for the fact that we're using the content layer
option.Definition(
underlying_type=option.UnderlyingType.ETI,
buy_sell='Buy',
instrument_code=instrument,
strike=float(strikePrice),
pricing_parameters=option.PricingParameters(
market_value_in_deal_ccy=float(dfLocalTimeZone['TRDPRC_1'][i]),
risk_free_rate_percent=float(dfLocalTimeZone['EurRfRate'][i]),
underlying_price=float(dfLocalTimeZone[
'underlying ' + underlying + ' TRDPRC_1'][i]),
pricing_model_type='BlackScholes',
volatility_type='Implied',
underlying_time_stamp='Default',
report_ccy='EUR'))
for i in range(len(dfLocalTimeZone.index))]
batchOf = 100
for i, j in enumerate(Chunks(CuniverseL, batchOf)):
print(f"Batch of {len(j)} requests no. {i+1}/{len([i for i in Chunks(CuniverseL, batchOf)])} started")
# Request sent via the content layer's financial-contracts Definitions
response4 = rdf.Definitions(universe=j, fields=requestFields)
response4 = response4.get_data()
if i == 0:
response4df = response4.data.df
else:
response4df = pd.concat([response4df, response4.data.df], ignore_index=True)  # DataFrame.append was removed in pandas 2.0
print(f"Batch of {len(j)} requests no. {i+1}/{len([i for i in Chunks(CuniverseL, 100)])} ended")
Batch of 100 requests no. 1/3 started Batch of 100 requests no. 1/3 ended Batch of 100 requests no. 2/3 started Batch of 100 requests no. 2/3 ended Batch of 34 requests no. 3/3 started Batch of 34 requests no. 3/3 ended
IPADf = response4df.copy() # IPA here stands for the service we used to get all the calculated values, Instrument Pricing Analytics.
IPADf.index = dfLocalTimeZone.index
IPADf.columns.name = dfLocalTimeZone.columns.name
IPADf.rename(columns={"Volatility": 'ImpliedVolatility'}, inplace=True)
IPADf
| STXE42250D3.EX | MarketValueInDealCcy | RiskFreeRatePercent | UnderlyingPrice | PricingModelType | DividendType | UnderlyingTimeStamp | ReportCcy | VolatilityType | ImpliedVolatility | DeltaPercent | GammaPercent | RhoPercent | ThetaPercent | VegaPercent |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2023-03-23 12:10:00+02:00 | 65.7 | 2.99 | 4176.5 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 20.650307 | 0.425992 | 0.001809 | 1.079711 | -1.884532 | 4.106245 |
| 2023-03-23 12:20:00+02:00 | 63.6 | 2.99 | 4170.06 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 20.798941 | 0.415109 | 0.001789 | 1.050709 | -1.883593 | 4.077227 |
| 2023-03-23 12:30:00+02:00 | 63.6 | 2.99 | 4169.96 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 20.809121 | 0.414981 | 0.001788 | 1.050346 | -1.884306 | 4.076843 |
| 2023-03-23 12:40:00+02:00 | 63.6 | 2.99 | 4170.47 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 20.75718 | 0.415636 | 0.001793 | 1.052202 | -1.880663 | 4.0788 |
| 2023-03-23 12:50:00+02:00 | 63.6 | 2.99 | 4171.68 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 20.633718 | 0.417204 | 0.001805 | 1.056641 | -1.871975 | 4.083428 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 2023-03-29 14:00:00+02:00 | 68.4 | 2.99 | 4213.68 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 17.233868 | 0.488814 | 0.002185 | 1.254797 | -1.626663 | 4.212997 |
| 2023-03-29 14:10:00+02:00 | 68.4 | 2.99 | 4213.48 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 17.257066 | 0.488415 | 0.002182 | 1.253675 | -1.628624 | 4.212683 |
| 2023-03-29 14:20:00+02:00 | 68.4 | 2.99 | 4214.88 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 17.094316 | 0.491231 | 0.002203 | 1.261585 | -1.614832 | 4.214797 |
| 2023-03-29 14:30:00+02:00 | 68.4 | 2.99 | 4215.13 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 17.065163 | 0.49174 | 0.002206 | 1.263013 | -1.612353 | 4.215154 |
| 2023-03-29 14:40:00+02:00 | 68.4 | 2.99 | 4215.61 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 17.009111 | 0.492721 | 0.002213 | 1.265768 | -1.607577 | 4.21582 |
234 rows × 14 columns
ATdfGMT = ATdf.copy()
ATdfLocalTimeZone = ATdf.copy()
ATdfLocalTimeZone.index = [
ATdf.index[i].replace(
tzinfo=pytz.timezone(
'GMT')).astimezone(
tz=datetime.now().astimezone().tzinfo)
for i in range(len(ATdf))]
ATdfGMT
| TRDPRC_1_OptPr | TRDPRC_1_UnderlyingPr | EurRfRate | |
|---|---|---|---|
| Timestamp | |||
| 2023-03-23 10:10:00 | 65.7 | 4176.5 | 2.990 |
| 2023-03-23 10:20:00 | 63.6 | 4170.06 | 2.990 |
| 2023-03-23 10:30:00 | 63.6 | 4169.96 | 2.990 |
| 2023-03-23 11:10:00 | 67.6 | 4184.29 | 2.990 |
| 2023-03-23 13:00:00 | 70.0 | 4186.71 | 2.990 |
| 2023-03-23 13:40:00 | 68.6 | 4191.93 | 2.990 |
| 2023-03-23 13:50:00 | 69.6 | 4197.17 | 2.990 |
| 2023-03-23 14:40:00 | 73.2 | 4206.26 | 2.990 |
| 2023-03-23 15:20:00 | 78.0 | 4207.85 | 2.990 |
| 2023-03-23 16:10:00 | 76.0 | 4205.64 | 2.990 |
| 2023-03-23 16:20:00 | 75.2 | 4205.84 | 2.990 |
| 2023-03-24 08:00:00 | 67.0 | 4155.74 | 3.025 |
| 2023-03-24 09:00:00 | 58.5 | 4153.61 | 3.025 |
| 2023-03-24 09:20:00 | 56.7 | 4142.65 | 3.025 |
| 2023-03-24 10:20:00 | 55.2 | 4119.19 | 3.025 |
| 2023-03-24 10:30:00 | 51.3 | 4114.7 | 3.025 |
| 2023-03-24 11:20:00 | 49.6 | 4110.14 | 3.025 |
| 2023-03-24 12:10:00 | 48.0 | 4104.0 | 3.025 |
| 2023-03-24 12:20:00 | 48.0 | 4111.43 | 3.025 |
| 2023-03-24 12:40:00 | 46.6 | 4110.48 | 3.025 |
| 2023-03-24 13:40:00 | 49.5 | 4121.7 | 3.025 |
| 2023-03-24 13:50:00 | 50.6 | 4134.15 | 3.025 |
| 2023-03-24 14:00:00 | 54.0 | 4126.4 | 3.025 |
| 2023-03-24 14:30:00 | 48.7 | 4124.55 | 3.025 |
| 2023-03-24 15:20:00 | 46.8 | 4118.08 | 3.025 |
| 2023-03-24 15:40:00 | 49.5 | 4129.03 | 3.025 |
| 2023-03-24 15:50:00 | 49.8 | 4136.98 | 3.025 |
| 2023-03-24 16:00:00 | 51.2 | 4132.09 | 3.025 |
| 2023-03-24 16:20:00 | 47.2 | 4124.25 | 3.025 |
| 2023-03-27 07:20:00 | 59.0 | 4166.37 | 3.012 |
| 2023-03-27 07:50:00 | 55.7 | 4147.46 | 3.012 |
| 2023-03-27 08:00:00 | 53.4 | 4154.24 | 3.012 |
| 2023-03-27 09:50:00 | 58.2 | 4166.21 | 3.012 |
| 2023-03-27 10:00:00 | 60.5 | 4172.21 | 3.012 |
| 2023-03-27 10:20:00 | 61.1 | 4173.36 | 3.012 |
| 2023-03-28 07:40:00 | 63.6 | 4189.97 | 2.990 |
| 2023-03-28 08:20:00 | 60.0 | 4181.98 | 2.990 |
| 2023-03-28 08:40:00 | 57.4 | 4176.89 | 2.990 |
| 2023-03-28 08:50:00 | 58.0 | 4176.46 | 2.990 |
| 2023-03-28 11:10:00 | 55.5 | 4170.95 | 2.990 |
| 2023-03-28 11:20:00 | 55.5 | 4174.9 | 2.990 |
| 2023-03-28 13:50:00 | 50.5 | 4157.18 | 2.990 |
| 2023-03-28 14:20:00 | 49.5 | 4168.05 | 2.990 |
| 2023-03-28 14:30:00 | 49.0 | 4161.55 | 2.990 |
| 2023-03-28 15:20:00 | 51.1 | 4170.0 | 2.990 |
| 2023-03-29 09:50:00 | 70.1 | 4215.09 | 3.015 |
| 2023-03-29 10:10:00 | 65.9 | 4208.04 | 3.015 |
| 2023-03-29 11:30:00 | 68.4 | 4216.29 | 3.015 |
ATdfLocalTimeZone
| TRDPRC_1_OptPr | TRDPRC_1_UnderlyingPr | EurRfRate | |
|---|---|---|---|
| 2023-03-23 12:10:00+02:00 | 65.7 | 4176.5 | 2.990 |
| 2023-03-23 12:20:00+02:00 | 63.6 | 4170.06 | 2.990 |
| 2023-03-23 12:30:00+02:00 | 63.6 | 4169.96 | 2.990 |
| 2023-03-23 13:10:00+02:00 | 67.6 | 4184.29 | 2.990 |
| 2023-03-23 15:00:00+02:00 | 70.0 | 4186.71 | 2.990 |
| 2023-03-23 15:40:00+02:00 | 68.6 | 4191.93 | 2.990 |
| 2023-03-23 15:50:00+02:00 | 69.6 | 4197.17 | 2.990 |
| 2023-03-23 16:40:00+02:00 | 73.2 | 4206.26 | 2.990 |
| 2023-03-23 17:20:00+02:00 | 78.0 | 4207.85 | 2.990 |
| 2023-03-23 18:10:00+02:00 | 76.0 | 4205.64 | 2.990 |
| 2023-03-23 18:20:00+02:00 | 75.2 | 4205.84 | 2.990 |
| 2023-03-24 10:00:00+02:00 | 67.0 | 4155.74 | 3.025 |
| 2023-03-24 11:00:00+02:00 | 58.5 | 4153.61 | 3.025 |
| 2023-03-24 11:20:00+02:00 | 56.7 | 4142.65 | 3.025 |
| 2023-03-24 12:20:00+02:00 | 55.2 | 4119.19 | 3.025 |
| 2023-03-24 12:30:00+02:00 | 51.3 | 4114.7 | 3.025 |
| 2023-03-24 13:20:00+02:00 | 49.6 | 4110.14 | 3.025 |
| 2023-03-24 14:10:00+02:00 | 48.0 | 4104.0 | 3.025 |
| 2023-03-24 14:20:00+02:00 | 48.0 | 4111.43 | 3.025 |
| 2023-03-24 14:40:00+02:00 | 46.6 | 4110.48 | 3.025 |
| 2023-03-24 15:40:00+02:00 | 49.5 | 4121.7 | 3.025 |
| 2023-03-24 15:50:00+02:00 | 50.6 | 4134.15 | 3.025 |
| 2023-03-24 16:00:00+02:00 | 54.0 | 4126.4 | 3.025 |
| 2023-03-24 16:30:00+02:00 | 48.7 | 4124.55 | 3.025 |
| 2023-03-24 17:20:00+02:00 | 46.8 | 4118.08 | 3.025 |
| 2023-03-24 17:40:00+02:00 | 49.5 | 4129.03 | 3.025 |
| 2023-03-24 17:50:00+02:00 | 49.8 | 4136.98 | 3.025 |
| 2023-03-24 18:00:00+02:00 | 51.2 | 4132.09 | 3.025 |
| 2023-03-24 18:20:00+02:00 | 47.2 | 4124.25 | 3.025 |
| 2023-03-27 09:20:00+02:00 | 59.0 | 4166.37 | 3.012 |
| 2023-03-27 09:50:00+02:00 | 55.7 | 4147.46 | 3.012 |
| 2023-03-27 10:00:00+02:00 | 53.4 | 4154.24 | 3.012 |
| 2023-03-27 11:50:00+02:00 | 58.2 | 4166.21 | 3.012 |
| 2023-03-27 12:00:00+02:00 | 60.5 | 4172.21 | 3.012 |
| 2023-03-27 12:20:00+02:00 | 61.1 | 4173.36 | 3.012 |
| 2023-03-28 09:40:00+02:00 | 63.6 | 4189.97 | 2.990 |
| 2023-03-28 10:20:00+02:00 | 60.0 | 4181.98 | 2.990 |
| 2023-03-28 10:40:00+02:00 | 57.4 | 4176.89 | 2.990 |
| 2023-03-28 10:50:00+02:00 | 58.0 | 4176.46 | 2.990 |
| 2023-03-28 13:10:00+02:00 | 55.5 | 4170.95 | 2.990 |
| 2023-03-28 13:20:00+02:00 | 55.5 | 4174.9 | 2.990 |
| 2023-03-28 15:50:00+02:00 | 50.5 | 4157.18 | 2.990 |
| 2023-03-28 16:20:00+02:00 | 49.5 | 4168.05 | 2.990 |
| 2023-03-28 16:30:00+02:00 | 49.0 | 4161.55 | 2.990 |
| 2023-03-28 17:20:00+02:00 | 51.1 | 4170.0 | 2.990 |
| 2023-03-29 11:50:00+02:00 | 70.1 | 4215.09 | 3.015 |
| 2023-03-29 12:10:00+02:00 | 65.9 | 4208.04 | 3.015 |
| 2023-03-29 13:30:00+02:00 | 68.4 | 4216.29 | 3.015 |
ATuniverseL = [
{
"instrumentType": "Option",
"instrumentDefinition": {
"buySell": "Buy",
"underlyingType": "Eti",
"instrumentCode": instrument,
"strike": str(strikePrice),
},
"pricingParameters": {
"marketValueInDealCcy": str(ATdfLocalTimeZone['TRDPRC_1_OptPr'][i]),
"riskFreeRatePercent": str(ATdfLocalTimeZone['EurRfRate'][i]),
"underlyingPrice": str(ATdfLocalTimeZone['TRDPRC_1_UnderlyingPr'][i]),
"pricingModelType": "BlackScholes",
"dividendType": "ImpliedYield",
"volatilityType": "Implied",
"underlyingTimeStamp": "Default",
"reportCcy": "EUR"
}
}
for i in range(len(ATdfLocalTimeZone.index))]
ATCUniverseL = [ # C here is for the fact that we're using the content layer
option.Definition(
underlying_type=option.UnderlyingType.ETI,
buy_sell='Buy',
instrument_code=instrument,
strike=float(strikePrice),
pricing_parameters=option.PricingParameters(
market_value_in_deal_ccy=float(ATdfLocalTimeZone['TRDPRC_1_OptPr'][i]),
risk_free_rate_percent=float(ATdfLocalTimeZone['EurRfRate'][i]),
underlying_price=float(ATdfLocalTimeZone['TRDPRC_1_UnderlyingPr'][i]),
pricing_model_type='BlackScholes',
volatility_type='Implied',
underlying_time_stamp='Default',
report_ccy='EUR'))
for i in range(len(ATdfLocalTimeZone.index))]
batchOf = 100
for i, j in enumerate(Chunks(ATCUniverseL, batchOf)):
print(f"Batch of {len(j)} requests no. {i+1}/{len([i for i in Chunks(ATCUniverseL, batchOf)])} started")
# Request sent via the content layer's financial-contracts Definitions
response5 = rdf.Definitions(
universe=j,
fields=requestFields)
response5 = response5.get_data()
if i == 0:
response5df = response5.data.df
else:
response5df = pd.concat([response5df, response5.data.df], ignore_index=True)  # DataFrame.append was removed in pandas 2.0
print(f"Batch of {len(j)} requests no. {i+1}/{len([i for i in Chunks(ATCUniverseL, batchOf)])} ended")
Batch of 48 requests no. 1/1 started Batch of 48 requests no. 1/1 ended
response5df.head(2)
| MarketValueInDealCcy | RiskFreeRatePercent | UnderlyingPrice | PricingModelType | DividendType | UnderlyingTimeStamp | ReportCcy | VolatilityType | Volatility | DeltaPercent | GammaPercent | RhoPercent | ThetaPercent | VegaPercent | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 65.7 | 2.99 | 4176.5 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 20.650307 | 0.425992 | 0.001809 | 1.079711 | -1.884532 | 4.106245 |
| 1 | 63.6 | 2.99 | 4170.06 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 20.798941 | 0.415109 | 0.001789 | 1.050709 | -1.883593 | 4.077227 |
ATIPADf = response5df.copy() # IPA here stands for the service we used to get all the calculated values, Instrument Pricing Analytics.
ATIPADf.index = ATdfLocalTimeZone.index
ATIPADf.columns.name = ATdfLocalTimeZone.columns.name
ATIPADf.rename(columns={"Volatility": 'ImpliedVolatility'}, inplace=True)
ATIPADf.head(2)
| MarketValueInDealCcy | RiskFreeRatePercent | UnderlyingPrice | PricingModelType | DividendType | UnderlyingTimeStamp | ReportCcy | VolatilityType | ImpliedVolatility | DeltaPercent | GammaPercent | RhoPercent | ThetaPercent | VegaPercent | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2023-03-23 12:10:00+02:00 | 65.7 | 2.99 | 4176.5 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 20.650307 | 0.425992 | 0.001809 | 1.079711 | -1.884532 | 4.106245 |
| 2023-03-23 12:20:00+02:00 | 63.6 | 2.99 | 4170.06 | BlackScholes | ImpliedYield | Default | EUR | Calculated | 20.798941 | 0.415109 | 0.001789 | 1.050709 | -1.883593 | 4.077227 |
display(searchDf2.iloc[
(searchDf2['StrikePrice']-currentUnderlyingPrc).abs().argsort()[:1]])
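The nearest-strike (At The Money) selection idiom used here can be checked on a toy frame: `argsort` of the absolute distance between each strike and the underlying's price ranks strikes by moneyness, and `[:1]` keeps the single closest one (strike values below are illustrative).

```python
import pandas as pd

strikes = pd.DataFrame({"StrikePrice": [4150, 4200, 4225, 4250, 4300]})
underlying_price = 4216.29

# Rank strikes by absolute distance to the underlying; keep the closest (ATM)
atm = strikes.iloc[(strikes["StrikePrice"] - underlying_price).abs().argsort()[:1]]
print(atm)  # the 4225 row, the strike nearest to 4216.29
```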
IPADfGraph = IPADf[['ImpliedVolatility', 'MarketValueInDealCcy',
'RiskFreeRatePercent', 'UnderlyingPrice', 'DeltaPercent',
'GammaPercent', 'RhoPercent', 'ThetaPercent', 'VegaPercent']]
fig = px.line(IPADfGraph) # This is just to see the implied vol graph when that field is available
# fig.layout = dict(xaxis=dict(type="category"))
# Format Graph: https://plotly.com/python/tick-formatting/
fig.update_layout(
title=instrument,
template='plotly_dark')
# Make it so that only one line is shown by default: # https://stackoverflow.com/questions/73384807/plotly-express-plot-subset-of-dataframe-columns-by-default-and-the-rest-as-opt
fig.for_each_trace(
lambda t: t.update(
visible=True if t.name in IPADfGraph.columns[:1] else "legendonly"))
# fig.update_xaxes(autorange=True)
# fig.update_layout(yaxis=IPADf.index[0::10])
fig.show()
| DocumentTitle | RIC | StrikePrice | ExchangeCode | ExpiryDate | UnderlyingQuoteRIC | InsertDateTime | RetireDate | |
|---|---|---|---|---|---|---|---|---|
| 13 | Eurex EURO STOXX 50 Index Option 4225 Call Apr... | STXE42250D3.EX | 4225 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 03:50:17 | 2023-04-25 |
This representation will allow us to see several graphs at different scales stacked above one another. This way, we can see if the change in Implied Volatility is caused by a movement in the underlying or the Option price itself:
fig = subplots.make_subplots(rows=3, cols=1)
fig.add_trace(go.Scatter(x=IPADf.index, y=IPADf.ImpliedVolatility, name='Op Imp Volatility'), row=1, col=1)
fig.add_trace(go.Scatter(x=IPADf.index, y=IPADf.MarketValueInDealCcy, name='Op Mk Pr'), row=2, col=1)
fig.add_trace(go.Scatter(x=IPADf.index, y=IPADf.UnderlyingPrice, name=underlying + ' Undrlyg Pr'), row=3, col=1)
fig.update(layout_xaxis_rangeslider_visible=False)
fig.update_layout(title=IPADf.columns.name)
fig.update_layout(
template='plotly_dark',
autosize=False,
width=1300,
height=500)
fig.show()
searchDf2.iloc[(searchDf2['StrikePrice']-currentUnderlyingPrc).abs().argsort()[:1]]
| DocumentTitle | RIC | StrikePrice | ExchangeCode | ExpiryDate | UnderlyingQuoteRIC | InsertDateTime | RetireDate | |
|---|---|---|---|---|---|---|---|---|
| 13 | Eurex EURO STOXX 50 Index Option 4225 Call Apr... | STXE42250D3.EX | 4225 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 03:50:17 | 2023-04-25 |
Certain companies are slow to update libraries, dependencies or Python versions, so they (or you) may not have access to plotly (the graph library we used above). Matplotlib is rather light and should work even on machines with old setups:
display(searchDf2.iloc[(searchDf2.StrikePrice-currentUnderlyingPrc).abs().argsort()[:1]])
ATIPADfSimpleGraph = pd.DataFrame(
data=ATIPADf.ImpliedVolatility.values, index=ATIPADf.ImpliedVolatility.index)
fig, ax = plt.subplots(ncols=1)
ax.plot(ATIPADfSimpleGraph, '.-')
# ax.xaxis.set_major_formatter(ticker.FuncFormatter(format_date))
ax.set_title(f"{searchDf2.iloc[(searchDf2.StrikePrice-currentUnderlyingPrc).abs().argsort()[:1]].RIC.values[0]} Implied Volatility At Trade Only")
fig.autofmt_xdate()
plt.show()
| DocumentTitle | RIC | StrikePrice | ExchangeCode | ExpiryDate | UnderlyingQuoteRIC | InsertDateTime | RetireDate | |
|---|---|---|---|---|---|---|---|---|
| 13 | Eurex EURO STOXX 50 Index Option 4225 Call Apr... | STXE42250D3.EX | 4225 | EUX | 2023-04-21 | [.STOXX50E] | 2023-03-09 03:50:17 | 2023-04-25 |
Note here that we are looking only 'At Trade', i.e. at times when the option itself traded (not just the underlying). There are therefore fewer datapoints.
Let's put it all together into a single function. This ImpVolatilityCalcIPA function will allow anyone to:
(I) find the option (i) with the index of your choice (.SPX or .STOXX50E) as underlying, (ii) with strike price closest to the underlying's current price (i.e.: At The Money) and (iii) with the next, closest expiry date past x days after today,
(II) calculate the Implied Volatility for that option either (i) only at times when the option itself is traded or (ii) at any time the option or the underlying is being traded.
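Before the full function, the two date steps it relies on can be sketched in pure Python: monthly index options expire on the 3rd Friday of the month, and we then pick the first expiry more than x days out. This is a simplification: the real `Get_exp_dates` below also shifts expiries that land on exchange holidays, using `pandas_market_calendars`.

```python
import calendar
from datetime import datetime, timedelta

def third_friday(year, month):
    """Date of the 3rd Friday of a month, ignoring exchange holidays."""
    # Start weeks on Saturday so the last element of each week row is a Friday
    c = calendar.Calendar(firstweekday=calendar.SATURDAY)
    return c.monthdatescalendar(year, month)[2][-1]

# Monthly expiries for 2023, then the first one more than x=15 days out
expiries = [third_friday(2023, m) for m in range(1, 13)]
today = datetime(2023, 3, 23).date()
x = 15
expiry_of_interest = [e for e in expiries if e > today + timedelta(days=x)][0]
print(expiry_of_interest)  # 2023-04-21, matching the ExpiryDate found above
```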
def ImpVolatilityCalcIPA(x=15,
instrument=None,
indexUnderlying=".STOXX50E",
callOrPut='Put',
dateBack=3,
expiryYearOfInterest=datetime.now().year,
riskFreeRate=None, riskFreeRateField=None,
timeZoneInGraph=datetime.now().astimezone(),
maxColwidth=200,
graphStyle='overlay', # 'overlay', '3 graphs', 'simple'
simpleGraphLineStyle='.-', # 'o-'
simpleGraphSize=(15, 5),
graphTemplate='plotly_dark',
debug=False,
returnDfGraph=False,
AtOptionTradeOnly=False):
if indexUnderlying == ".STOXX50E":
exchangeC, exchangeRIC, mcalGetCalendar = 'EUX', 'STX', 'EUREX'
elif indexUnderlying == '.SPX':
exchangeC, exchangeRIC, mcalGetCalendar = 'OPQ', 'SPX', 'CBOE_Futures' # 'CBOE_Index_Options' would be the more natural calendar here, but 'CBOE_Futures' is used as a close substitute
def Get_exp_dates(year=expiryYearOfInterest,
days=True,
mcal_get_calendar=mcalGetCalendar):
'''
Get_exp_dates Version 3.0:
This function gets expiration dates for a year for index options, which are the 3rd Fridays of each month.
Changes
----------------------------------------------
Changed from Version 1.0 to 2.0: Jonathan Legrand changed Haykaz Aramyan's original code to allow
(i) for the function's holiday argument to be changed, and defaulted to 'EUREX' as opposed to 'CBOE_Index_Options' and
(ii) for the function to output full date objects, as opposed to just days of the month, if argument days=False.
Changed from Version 2.0 to 3.0: Jonathan Legrand changed this function to reflect the fact that it can be used for indexes other than EUREX.
Dependencies
----------------------------------------------
Python library 'pandas_market_calendars' version 3.2
Parameters
-----------------------------------------------
Input:
year(int): year for which expiration days are requested
mcal_get_calendar(str):
String of the calendar for which holidays have to be taken into account.
More on this calendar (link to Github checked 2022-10-11): https://github.com/rsheftel/pandas_market_calendars/blob/177e7922c7df5ad249b0d066b5c9e730a3ee8596/pandas_market_calendars/exchange_calendar_cboe.py
Default: mcal_get_calendar='EUREX'
days(bool): If True, only days of the month are output; otherwise, datetime.date objects are.
Default: days=True
Output:
dates(dict): dictionary of expiration days for each month of a specified year in datetime.date format.
'''
# get CBOE market holidays
Cal = mcal.get_calendar(mcal_get_calendar)
holidays = Cal.holidays().holidays
# set calendar starting from Saturday
c = calendar.Calendar(firstweekday=calendar.SATURDAY)
# get the 3rd Friday of each month
exp_dates = {}
for i in range(1, 13):
monthcal = c.monthdatescalendar(year, i)
date = monthcal[2][-1]
# check if found date is an holiday and get the previous date if it is
if date in holidays:
date = date + timedelta(-1)
# append the date to the dictionary
if year in exp_dates:
### Changed from original code from here on by Jonathan Legrand on 2022-10-11
if days: exp_dates[year].append(date.day)
else: exp_dates[year].append(date)
else:
if days: exp_dates[year] = [date.day]
else: exp_dates[year] = [date]
return exp_dates
timeOfCalcDatetime = datetime.now() # For now, we will focus on the use-case where we are calculating values for today; later we will allow for any day going back a few business days.
timeOfCalcStr = datetime.now().strftime('%Y-%m-%d')
fullDatesAtTimeOfCalc = Get_exp_dates(timeOfCalcDatetime.year, days=False) # `timeOfCalcDatetime.year` here is 2023
fullDatesAtTimeOfCalcDatetime = [
datetime(i.year, i.month, i.day)
for i in fullDatesAtTimeOfCalc[list(fullDatesAtTimeOfCalc.keys())[0]]]
expiryDateOfInt = [i for i in fullDatesAtTimeOfCalcDatetime
if i > timeOfCalcDatetime + relativedelta(days=x)][0]
if debug: print(f"expiryDateOfInt: {expiryDateOfInt}")
response = search.Definition(
view = search.Views.SEARCH_ALL, # To see what views are available: `help(search.Views)` & `search.metadata.Definition(view = search.Views.SEARCH_ALL).get_data().data.df.to_excel("SEARCH_ALL.xlsx")`
query=indexUnderlying,
select="DocumentTitle, RIC, StrikePrice, ExchangeCode, ExpiryDate, UnderlyingAsset, " +
"UnderlyingAssetName, UnderlyingAssetRIC, ESMAUnderlyingIndexCode, RCSUnderlyingMarket, " +
"UnderlyingQuoteName, UnderlyingQuoteRIC, InsertDateTime, RetireDate",
filter=f"RCSAssetCategoryLeaf eq 'Option' and RIC eq '{exchangeRIC}*' and DocumentTitle ne '*Weekly*' " +
f"and CallPutOption eq '{callOrPut}' and ExchangeCode eq '{exchangeC}' and " +
f"ExpiryDate ge {(expiryDateOfInt - relativedelta(days=1)).strftime('%Y-%m-%d')} " +
f"and ExpiryDate lt {(expiryDateOfInt + relativedelta(days=1)).strftime('%Y-%m-%d')}", # ge (greater than or equal to), gt (greater than), lt (less than) and le (less than or equal to). These can only be applied to numeric and date properties.
top=10000,
).get_data()
searchDf = response.data.df
if debug: display(searchDf)
try:
underlyingPrice = rd.get_history(
universe=[indexUnderlying],
fields=["TRDPRC_1"],
interval="tick").iloc[-1][0]
except Exception:
print("Function failed at the search stage, returning the following dataframe: ")
display(searchDf)
if debug:
print(f"Underlying {indexUnderlying}'s price recorded here was {underlyingPrice}")
display(searchDf.iloc[(searchDf['StrikePrice']-underlyingPrice).abs().argsort()[:10]])
if instrument is None:
instrument = searchDf.iloc[(searchDf['StrikePrice']-underlyingPrice).abs().argsort()[:1]].RIC.values[0]
start = (timeOfCalcDatetime - pd.tseries.offsets.BDay(dateBack)).strftime('%Y-%m-%dT%H:%M:%S.%f') # '2022-10-05T07:30:00.000'
endDateTime = datetime.now()
end = endDateTime.strftime('%Y-%m-%dT%H:%M:%S.%f') # e.g.: '2022-09-09T20:00:00.000'
_optnMrktPrice = rd.get_history(
universe=[instrument],
fields=["TRDPRC_1"],
interval="10min",
start=start, # Ought to always start at 4 am for OPRA exchanged Options, more info in the article below
end=end) # Ought to always end at 8 pm for OPRA exchanged Options, more info in the article below
if debug:
print(instrument)
display(_optnMrktPrice)
## Data on certain options is stale and does not necessarily show up on Workspace. In case that happens, we will pick the next ATM Option, which will probably have the same strike; but we will only do so a couple of times, any more and we could get too far from the strike:
if _optnMrktPrice.empty:
if debug: print(f"No data could be found for {instrument}, so the next ATM Option was chosen")
instrument = searchDf.iloc[(searchDf['StrikePrice']-underlyingPrice).abs().argsort()[1:2]].RIC.values[0]
if debug: print(f"{instrument}")
_optnMrktPrice = rd.get_history(universe=[instrument],
fields=["TRDPRC_1"], interval="10min",
start=start, end=end)
if debug: display(_optnMrktPrice)
if _optnMrktPrice.empty: # Let's try one more time, as is often necessary
if debug: print(f"No data could be found for {instrument}, so the next ATM Option was chosen")
instrument = searchDf.iloc[(searchDf['StrikePrice']-underlyingPrice).abs().argsort()[2:3]].RIC.values[0]
if debug: print(f"{instrument}")
_optnMrktPrice = rd.get_history(universe=[instrument],
fields=["TRDPRC_1"], interval="10min",
start=start, end=end)
if debug: display(_optnMrktPrice)
if _optnMrktPrice.empty:
print(f"No data could be found for {instrument}, please check it on Refinitiv Workspace")
optnMrktPrice = _optnMrktPrice.resample('10Min').mean() # get a datapoint every 10 min
optnMrktPrice = optnMrktPrice[optnMrktPrice.index.strftime('%Y-%m-%d').isin([i for i in _optnMrktPrice.index.strftime('%Y-%m-%d').unique()])] # Only keep trading days
optnMrktPrice = optnMrktPrice.loc[(optnMrktPrice.index.strftime('%H:%M:%S') >= '07:30:00') & (optnMrktPrice.index.strftime('%H:%M:%S') <= '22:00:00')] # Only keep trading hours
optnMrktPrice = optnMrktPrice.ffill() # Forward Fill to populate NaN values
# Note also that one may want to only look at 'At Option Trade' datapoints,
# i.e.: Implied Volatility when a trade is made for the Option, but not when
# none is made. For this, we will use the 'At Trade' (`AT`) dataframes:
if AtOptionTradeOnly: AToptnMrktPrice = _optnMrktPrice
underlying = searchDf.iloc[(searchDf['StrikePrice']-underlyingPrice).abs().argsort()[:1]].UnderlyingQuoteRIC.values[0][0]
_underlyingMrktPrice = rd.get_history(
universe=[underlying],
fields=["TRDPRC_1"],
interval="10min",
start=start,
end=end)
# Let's put it all in one data-frame, `df`. Some datasets will have data
# going from the time we set for `start` all the way to `end`. Some won't,
# because no trade happened in the past few minutes/hours. We ought to base
# ourselves on the dataset with values getting closer to `end` and `ffill`
# the other column. As a result, the following `if` statement is needed:
if optnMrktPrice.index[-1] >= _underlyingMrktPrice.index[-1]:
df = optnMrktPrice.copy()
df[f"underlying {underlying} TRDPRC_1"] = _underlyingMrktPrice
else:
df = _underlyingMrktPrice.copy()
df.rename(
columns={"TRDPRC_1": f"underlying {underlying} TRDPRC_1"},
inplace=True)
df['TRDPRC_1'] = optnMrktPrice
df.columns.name = optnMrktPrice.columns.name
df.ffill(inplace=True) # Forward Fill to populate NaN values
df = df.dropna()
if AtOptionTradeOnly:
ATunderlyingMrktPrice = AToptnMrktPrice.join(
_underlyingMrktPrice, lsuffix='_OptPr', how='inner',
rsuffix=f" Underlying {underlying} TRDPRC_1")
ATdf = ATunderlyingMrktPrice
strikePrice = searchDf.iloc[(searchDf['StrikePrice']-underlyingPrice).abs().argsort()[:1]].StrikePrice.values[0]
if riskFreeRate is None and indexUnderlying == ".SPX":
_riskFreeRate = 'USDCFCFCTSA3M='
_riskFreeRateField = 'TR.FIXINGVALUE'
elif riskFreeRate is None and indexUnderlying == ".STOXX50E":
_riskFreeRate = 'EURIBOR3MD='
_riskFreeRateField = 'TR.FIXINGVALUE'
else:
_riskFreeRate, _riskFreeRateField = riskFreeRate, riskFreeRateField
_RfRate = rd.get_history(
universe=[_riskFreeRate], # USD3MFSR=, USDSOFR=
fields=[_riskFreeRateField],
# Since we will use `dropna()` as a way to select the rows we are after later on in the code, we need to ask for more risk-free data than needed, just in case we don't have enough:
start=(datetime.strptime(start, '%Y-%m-%dT%H:%M:%S.%f') - timedelta(days=1)).strftime('%Y-%m-%d'),
end=(datetime.strptime(end, '%Y-%m-%dT%H:%M:%S.%f') + timedelta(days=1)).strftime('%Y-%m-%d'))
RfRate = _RfRate.resample('10Min').mean().ffill()
if AtOptionTradeOnly:
pd.options.mode.chained_assignment = None # default='warn'
ATunderlyingMrktPrice['RfRate'] = [pd.NA for i in ATunderlyingMrktPrice.index]
for i in RfRate.index:
_i = str(i)[:10]
for n, j in enumerate(ATunderlyingMrktPrice.index):
if _i in str(j):
if len(RfRate.loc[i].values) == 2:
ATunderlyingMrktPrice['RfRate'].iloc[n] = RfRate.loc[i].values[0][0]
elif len(RfRate.loc[i].values) == 1:
ATunderlyingMrktPrice['RfRate'].iloc[n] = RfRate.loc[i].values[0]
ATdf = ATunderlyingMrktPrice.copy()
ATdf = ATdf.ffill() # This is in case no Risk Free datapoints were released after a certain time, while trades on the option still went through.
else:
df['RfRate'] = RfRate
df = df.ffill()
if timeZoneInGraph != 'GMT' and AtOptionTradeOnly:
ATdf.index = [
ATdf.index[i].replace(
tzinfo=pytz.timezone(
'GMT')).astimezone(
tz=datetime.now().astimezone().tzinfo)
for i in range(len(ATdf))]
elif timeZoneInGraph != 'GMT':
df.index = [
df.index[i].replace(
tzinfo=pytz.timezone(
'GMT')).astimezone(
tz=timeZoneInGraph.tzinfo)
for i in range(len(df))]
if AtOptionTradeOnly:
if debug:
print("ATdf")
display(ATdf)
universeL = [
option.Definition(
underlying_type=option.UnderlyingType.ETI,
buy_sell='Buy',
instrument_code=instrument,
strike=float(strikePrice),
pricing_parameters=option.PricingParameters(
market_value_in_deal_ccy=float(ATdf.TRDPRC_1_OptPr[i]),
risk_free_rate_percent=float(ATdf.RfRate[i]),
underlying_price=float(ATdf[
f"TRDPRC_1 Underlying {underlying} TRDPRC_1"][i]),
pricing_model_type='BlackScholes',
volatility_type='Implied',
underlying_time_stamp='Default',
report_ccy='EUR'))
for i in range(len(ATdf.index))]
else:
if debug:
print("df")
display(df)
universeL = [ # One IPA Option Definition per timestamp in the dataframe
option.Definition(
underlying_type=option.UnderlyingType.ETI,
buy_sell='Buy',
instrument_code=instrument,
strike=float(strikePrice),
pricing_parameters=option.PricingParameters(
market_value_in_deal_ccy=float(df.TRDPRC_1[i]),
risk_free_rate_percent=float(df.RfRate[i]),
underlying_price=float(df[
f"underlying {underlying} TRDPRC_1"][i]),
pricing_model_type='BlackScholes',
volatility_type='Implied',
underlying_time_stamp='Default',
report_ccy='EUR'))
for i in range(len(df.index))]
def Chunks(lst, n):
"""Yield successive n-sized chunks from lst."""
for i in range(0, len(lst), n):
yield lst[i:i + n]
requestFields = [
"MarketValueInDealCcy", "RiskFreeRatePercent",
"UnderlyingPrice", "PricingModelType",
"DividendType", "UnderlyingTimeStamp",
"ReportCcy", "VolatilityType",
"Volatility", "DeltaPercent", "GammaPercent",
"RhoPercent", "ThetaPercent", "VegaPercent"]
batchOf = 100
for i, j in enumerate(Chunks(universeL, batchOf)):
if debug: print(f"Batch of {len(j)} requests no. {i+1}/{len([i for i in Chunks(universeL, batchOf)])} started")
# Send this batch of Option Definitions to IPA in one request
request_definition = rdf.Definitions(universe=j, fields=requestFields)
response = request_definition.get_data()
if i == 0:
IPADf = response.data.df
else:
IPADf = pd.concat([IPADf, response.data.df], ignore_index=True)
if debug: print(f"Batch of {len(j)} requests no. {i+1}/{len([i for i in Chunks(universeL, batchOf)])} ended")
if AtOptionTradeOnly:
IPADf.index = ATdf.index
IPADf.columns.name = ATdf.columns.name
else:
IPADf.index = df.index
IPADf.columns.name = df.columns.name
IPADf.rename(columns={"Volatility": 'ImpliedVolatility'}, inplace=True)
# We are going to want to show details about the data retrieved in a dataframe in the output of this function. The one line below allows us to maximise the column width of cells so we can see all that is written within them.
pd.options.display.max_colwidth = maxColwidth
if graphStyle == 'simple':
display(searchDf.iloc[(searchDf.StrikePrice-underlyingPrice).abs().argsort()[:1]])
fig, axes = plt.subplots(ncols=1, figsize=simpleGraphSize)
axes.plot(
pd.DataFrame( # Unfortunately, Matplotlib, the library used here for simple graphs, requires our dataframe to be in a specific format, which necessitates the use of `pd.DataFrame`
data=IPADf[['ImpliedVolatility']].ImpliedVolatility.values,
index=IPADf[['ImpliedVolatility']].ImpliedVolatility.index),
simpleGraphLineStyle)
if AtOptionTradeOnly: axes.set_title(f"{instrument} Implied Volatility At Trade Only")
else: axes.set_title(f"{instrument} Implied Volatility")
plt.show()
else:
display(searchDf.iloc[(searchDf['StrikePrice']-underlyingPrice).abs().argsort()[:1]])
IPADfGraph = IPADf[
['ImpliedVolatility', 'MarketValueInDealCcy',
'RiskFreeRatePercent', 'UnderlyingPrice', 'DeltaPercent',
'GammaPercent', 'RhoPercent', 'ThetaPercent', 'VegaPercent']]
if debug: display(IPADfGraph)
try: # This is needed in case there is not enough data to calculate values for all timestamps, see https://stackoverflow.com/questions/67244912/wide-format-csv-with-plotly-express
fig = px.line(IPADfGraph)
except Exception:
if returnDfGraph:
return IPADfGraph
else:
IPADfGraph = IPADfGraph[
["ImpliedVolatility", "MarketValueInDealCcy",
"RiskFreeRatePercent", "UnderlyingPrice"]]
fig = px.line(IPADfGraph)
if graphStyle == 'overlay':
fig.update_layout(
title=instrument,
template=graphTemplate)
fig.for_each_trace(
lambda t: t.update(
visible=True if t.name in IPADfGraph.columns[:1] else "legendonly"))
fig.show()
elif graphStyle == '3 graphs':
fig = plotly.subplots.make_subplots(rows=3, cols=1)
fig.add_trace(go.Scatter(
x=IPADf.index, y=IPADfGraph.ImpliedVolatility,
name='Op Imp Volatility'), row=1, col=1)
fig.add_trace(go.Scatter(
x=IPADf.index, y=IPADfGraph.MarketValueInDealCcy,
name='Op Mk Pr'), row=2, col=1)
fig.add_trace(go.Scatter(
x=IPADf.index, y=IPADfGraph.UnderlyingPrice,
name=f"{underlying} Undrlyg Pr"), row=3, col=1)
fig.update(layout_xaxis_rangeslider_visible=False)
fig.update_layout(title=IPADfGraph.columns.name)
fig.update_layout(
title=instrument,
template=graphTemplate,
autosize=False,
width=1300,
height=500)
fig.show()
else:
print("Looks like the argument `graphStyle` used is incorrect. Try `simple`, `overlay` or `3 graphs`")
ImpVolatilityCalcIPA( # This will pick up 10 min data
x=15,
indexUnderlying=".SPX", # ".SPX" or ".STOXX50E"
callOrPut='Call', # 'Put' or 'Call'
dateBack=3,
expiryYearOfInterest=datetime.now().year,
riskFreeRate=None,
riskFreeRateField=None, # 'TR.FIXINGVALUE'
timeZoneInGraph=datetime.now().astimezone(),
maxColwidth=200,
graphStyle='overlay', # 'overlay', '3 graphs', 'simple'
simpleGraphLineStyle='.-', # 'o-'
simpleGraphSize=(15, 5),
graphTemplate='plotly_dark',
debug=False,
returnDfGraph=True,
AtOptionTradeOnly=True)
| | DocumentTitle | RIC | StrikePrice | ExchangeCode | ExpiryDate | UnderlyingQuoteRIC | InsertDateTime | RetireDate |
|---|---|---|---|---|---|---|---|---|
| 386 | OPRA S&P 500 Index Option 3970 Call Apr 2023 , Stock Index Cash Option, Call 3970 USD 21-Apr-2023, OPRA | SPXWd212339700.U | 3970 | OPQ | 2023-04-21 | [.SPX] | 2023-03-09 04:00:19 | 2023-04-25 |
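The batching logic inside the function (splitting `universeL` into groups of up to 100 `option.Definition` objects before sending them to IPA) can be illustrated in isolation with plain Python lists; the `Chunks` generator below is the same one defined in the function, and the integers stand in for the Definition objects:

```python
def Chunks(lst, n):
    """Yield successive n-sized chunks from lst."""
    for i in range(0, len(lst), n):
        yield lst[i:i + n]

# With a batch size of 100, 250 requests are split into 3 batches:
requests = list(range(250))  # stand-ins for option.Definition objects
batches = list(Chunks(requests, 100))
print([len(b) for b in batches])  # → [100, 100, 50]
```

Each batch is then sent as one `rdf.Definitions(universe=batch, fields=requestFields)` request, which is what keeps the number of round-trips to the service small.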
If you're interested in running this function in a loop that updates every 5 seconds, the cell below is for you. I am not running it here as it is an infinite loop, meaning that it will not stop running and won't allow subsequent cells to run.
# while True:
# # Code executed here
# clear_output(wait=True)
# ImpVolatilityCalcIPA(
# dateBack=3, indexUnderlying=".STOXX50E", callOrPut='Call',
# graphStyle='simple', AtOptionTradeOnly=True)
# time.sleep(5)
The code in the cell below was written expertly by Haykaz Aramyan in the article 'Functions to find Option RICs traded on different exchanges'. I wanted to introduce it towards the end of this article as it uses more advanced Python notions such as Classes. We look into reconstructing expired option RICs, which follow a different nomenclature to live ones:
Below, we put ourselves in the shoes of an analyst backtesting a strategy involving past historical Implied Volatilities; e.g., if the average 3-business-day historical Implied Volatility of an Option contract is too high, they would not consider it for their portfolio.
Something to keep in mind is that no intraday price data is available for options that expired 3 months (or more) prior; therefore, when intraday data is not available, daily data will be used.
Let's focus on .STOXX50E.
We are applying similar logic to what was seen above. As a result, we'll use the same object names and simply append 2 to them, from indexUnderlying2 onwards:
timeOfCalc2, indexUnderlying2 = "2022-04-01", ".STOXX50E"
timeOfCalcDatetime2 = datetime.strptime(timeOfCalc2, '%Y-%m-%d')
currentUnderlyingPrc2 = rd.get_history(
universe=[indexUnderlying2],
start=timeOfCalc2, # , end: "OptDateTime"=None
fields=["TRDPRC_1"],
interval="tick").iloc[-1][0]
currentUnderlyingPrc2
4214.51
if indexUnderlying2 == ".STOXX50E":
exchangeC2, exchangeRIC2, mcalGetCalendar2 = "EUX", "STX", "EUREX"
elif indexUnderlying2 == ".SPX":
exchangeC2, exchangeRIC2, mcalGetCalendar2 = "OPQ", "SPX", "CBOE_Futures"
exchangeC2, exchangeRIC2, mcalGetCalendar2
('EUX', 'STX', 'EUREX')
Now we can get the expiry dates for our new scenario, based on a calculation time of "2022-04-01":
fullDatesAtTimeOfCalc2 = Get_exp_dates(
year=2022, days=False,
mcal_get_calendar=mcalGetCalendar2)
fullDatesAtTimeOfCalc2
{2022: [datetime.date(2022, 1, 21),
datetime.date(2022, 2, 18),
datetime.date(2022, 3, 18),
datetime.date(2022, 4, 14),
datetime.date(2022, 5, 20),
datetime.date(2022, 6, 17),
datetime.date(2022, 7, 15),
datetime.date(2022, 8, 19),
datetime.date(2022, 9, 16),
datetime.date(2022, 10, 21),
datetime.date(2022, 11, 18),
datetime.date(2022, 12, 16)]}
fullDatesAtTimeOfCalcDatetime2 = [
datetime(i.year, i.month, i.day)
for i in fullDatesAtTimeOfCalc2[list(fullDatesAtTimeOfCalc2.keys())[0]]]
fullDatesAtTimeOfCalcDatetime2
[datetime.datetime(2022, 1, 21, 0, 0), datetime.datetime(2022, 2, 18, 0, 0), datetime.datetime(2022, 3, 18, 0, 0), datetime.datetime(2022, 4, 14, 0, 0), datetime.datetime(2022, 5, 20, 0, 0), datetime.datetime(2022, 6, 17, 0, 0), datetime.datetime(2022, 7, 15, 0, 0), datetime.datetime(2022, 8, 19, 0, 0), datetime.datetime(2022, 9, 16, 0, 0), datetime.datetime(2022, 10, 21, 0, 0), datetime.datetime(2022, 11, 18, 0, 0), datetime.datetime(2022, 12, 16, 0, 0)]
expiryDateOfInt2 = [i for i in fullDatesAtTimeOfCalcDatetime2
if i > timeOfCalcDatetime2 + relativedelta(days=x)][0]
expiryDateOfInt2
datetime.datetime(2022, 5, 20, 0, 0)
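The expiry-of-interest filter simply takes the first monthly expiry falling more than `x` days after the calculation date (`x=15` was set in the earlier `ImpVolatilityCalcIPA` call). In isolation, using the standard-library `timedelta` in place of `relativedelta` (equivalent for whole days):

```python
from datetime import datetime, timedelta

# The first few 2022 monthly expiries from the output above
expiries = [datetime(2022, m, d) for m, d in
            [(1, 21), (2, 18), (3, 18), (4, 14), (5, 20), (6, 17)]]
time_of_calc = datetime(2022, 4, 1)
x = 15  # minimum number of days to expiry

# First expiry strictly after time_of_calc + x days (2022-04-16),
# so 2022-04-14 is skipped and 2022-05-20 is chosen:
next_expiry = [e for e in expiries if e > time_of_calc + timedelta(days=x)][0]
print(next_expiry)  # → 2022-05-20 00:00:00
```

This matches the `expiryDateOfInt2` value of `datetime.datetime(2022, 5, 20, 0, 0)` above.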
We'll need new functions Get_exp_month and Check_ric:
def Get_exp_month(exp_date, opt_type):
# define option expiration identifiers
ident = {
'1': {'exp': 'A', 'C': 'A', 'P': 'M'},
'2': {'exp': 'B', 'C': 'B', 'P': 'N'},
'3': {'exp': 'C', 'C': 'C', 'P': 'O'},
'4': {'exp': 'D', 'C': 'D', 'P': 'P'},
'5': {'exp': 'E', 'C': 'E', 'P': 'Q'},
'6': {'exp': 'F', 'C': 'F', 'P': 'R'},
'7': {'exp': 'G', 'C': 'G', 'P': 'S'},
'8': {'exp': 'H', 'C': 'H', 'P': 'T'},
'9': {'exp': 'I', 'C': 'I', 'P': 'U'},
'10': {'exp': 'J', 'C': 'J', 'P': 'V'},
'11': {'exp': 'K', 'C': 'K', 'P': 'W'},
'12': {'exp': 'L', 'C': 'L', 'P': 'X'}}
# get expiration month code for a month
if opt_type.upper() == 'C':
exp_month = ident[str(exp_date.month)]['C']
elif opt_type.upper() == 'P':
exp_month = ident[str(exp_date.month)]['P']
return ident, exp_month
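The `ident` table above encodes call options as letters A–L and put options as M–X, indexed by expiry month. As a quick, self-contained sanity check, the same mapping can be reimplemented with two strings (this is an illustrative sketch, not part of the article's code, which uses `Get_exp_month` directly):

```python
# Call codes run A–L (Jan–Dec); put codes run M–X (Jan–Dec),
# matching the `ident` dictionary in Get_exp_month above.
CALL_CODES = "ABCDEFGHIJKL"
PUT_CODES = "MNOPQRSTUVWX"

def month_code(month: int, opt_type: str) -> str:
    codes = CALL_CODES if opt_type.upper() == "C" else PUT_CODES
    return codes[month - 1]

print(month_code(5, "P"))  # → 'Q' (the 'Q' in the May-2022 put RIC STXE42000Q2.EX seen later)
print(month_code(1, "C"))  # → 'A'
```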
def Check_ric(ric, maturity, ident):
exp_date = pd.Timestamp(maturity)
# get start and end date for get_historical_price_summaries
# query (take current date minus 90 days period)
sdate = (datetime.now() - timedelta(90)).strftime('%Y-%m-%d')
edate = datetime.now().strftime('%Y-%m-%d')
# check if option is matured. If yes, add expiration syntax and recalculate
# start and end date of the query (take expiration day minus 90 days period)
if pd.Timestamp(maturity) < datetime.now():
ric = ric + '^' + ident[str(exp_date.month)]['exp'] + str(exp_date.year)[-2:]
sdate = (exp_date - timedelta(90)).strftime('%Y-%m-%d')
edate = exp_date.strftime('%Y-%m-%d')
# request option prices. Please note, there is no settle price for OPRA traded options
fieldsRequest = ['BID', 'ASK', 'TRDPRC_1']
if not ric.split('.')[1][0] == 'U':
fieldsRequest.append('SETTLE')
prices = rd.content.historical_pricing.summaries.Definition(
ric, start=sdate, end=edate,
interval=rd.content.historical_pricing.Intervals.DAILY,
fields=fieldsRequest).get_data()
return ric, prices
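For matured options, `Check_ric` appends an expiry suffix of the form `^<month code><2-digit year>` to the RIC. Below is a small, self-contained sketch of just that string construction, with the month-code table hard-coded (it reuses the 'exp' column of the `ident` dictionary above; the example RIC is the May-2022 EUREX put found later in the article):

```python
import pandas as pd

# Expiry-month codes A–L for Jan–Dec, as in the 'exp' entries of `ident` above
EXP_CODES = "ABCDEFGHIJKL"

def expired_suffix(maturity: str) -> str:
    exp_date = pd.Timestamp(maturity)
    return "^" + EXP_CODES[exp_date.month - 1] + str(exp_date.year)[-2:]

# A May-2022 expiry yields '^E22', turning the live RIC into its expired form:
print("STXE42000Q2.EX" + expired_suffix("2022-05-20"))  # → 'STXE42000Q2.EX^E22'
```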
Now we can get the EUREX RIC:
def Get_ric_eurex(asset, maturity, strike, opt_type):
exp_date = pd.Timestamp(maturity)
if asset[0] == '.':
asset_name = asset[1:]
if asset_name == 'FTSE':
asset_name = 'OTUK'
elif asset_name == 'SSMI':
asset_name = 'OSMI'
elif asset_name == 'GDAXI':
asset_name = 'GDAX'
elif asset_name == 'ATX':
asset_name = 'FATXA'
elif asset_name == 'STOXX50E':
asset_name = 'STXE'
else:
asset_name = asset.split('.')[0]
ident, exp_month = Get_exp_month(
exp_date=exp_date, opt_type=opt_type)
if type(strike) == float:
int_part = int(strike)
dec_part = str(str(strike).split('.')[1])[0]
else:
int_part = int(strike)
dec_part = '0'
if len(str(int(strike))) == 1:
strike_ric = '0' + str(int_part) + dec_part
else:
strike_ric = str(int_part) + dec_part
possible_rics = []
generations = ['', 'a', 'b', 'c', 'd']
for gen in generations:
ric = asset_name + strike_ric + gen + exp_month + str(exp_date.year)[-1:] + '.EX'
ric, prices = Check_ric(ric, maturity, ident)
if prices is not None:
return ric, prices
else:
possible_rics.append(ric)
print(f'Here is a list of possible RICs {possible_rics}, however we could not find any prices for those!')
return ric, prices
Note that this function, Get_ric_eurex, needs a round number as the strike price:
int(round(currentUnderlyingPrc2, -2))
4200
instrument2, instrument2Prices = Get_ric_eurex(
asset='.STOXX50E', opt_type='P',
maturity=expiryDateOfInt2.strftime('%Y-%m-%d'),
strike=int(round(currentUnderlyingPrc2, -2)))
instrument2
'STXE42000Q2.EX^E22'
instrument2Prices.data.df
| STXE42000Q2.EX^E22 | BID | ASK | TRDPRC_1 | SETTLE |
|---|---|---|---|---|
| Date | ||||
| 2022-02-21 | 349.2 | 360.3 | <NA> | 353.8 |
| 2022-02-22 | 344.8 | 357.2 | <NA> | 349.3 |
| 2022-02-23 | 351.2 | 365.7 | <NA> | 358.9 |
| 2022-02-24 | 466.1 | 486.0 | 508.2 | 476.6 |
| 2022-02-25 | 353.0 | 366.0 | <NA> | 357.0 |
| ... | ... | ... | ... | ... |
| 2022-05-16 | 511.5 | 543.3 | NaN | 527.1 |
| 2022-05-17 | 458.2 | 471.0 | NaN | 465.0 |
| 2022-05-18 | 504.7 | 534.2 | NaN | 518.8 |
| 2022-05-19 | 551.8 | 585.1 | NaN | 569.7 |
| 2022-05-20 | 477.6 | 540.8 | NaN | 500.2 |
63 rows × 4 columns
Above, we looked at the specific EUREX use case; let's generalise it:
from typing import Tuple, Union, Dict, List, Any
class Option_RIC:
"""
Option_RIC
"""
def __init__(
self,
maturity: str, # '2022-01-21'
strike: int,
opt_type: str, # 'C' or 'P'
asset: str = ".STOXX50E",
debug: bool = False,
topNuSearchResults: int = 100):
# Most objects are simple to define at this stage, but soon you'll see that two of them are a little more finicky
self.maturity = pd.Timestamp(maturity)
self.strike = strike
self.opt_type = opt_type
self.debug = debug
self.asset = asset
response = search.Definition(
query=asset,
filter="SearchAllCategory eq 'Options' and Periodicity eq 'Monthly' ",
select=' RIC, DocumentTitle, UnderlyingQuoteRIC,Periodicity, ExchangeCode',
navigators="ExchangeCode",
top=topNuSearchResults).get_data()
result = response.data.raw["Navigators"]["ExchangeCode"]
exchange_codes = []
for i in range(len(result['Buckets'])):
code = result['Buckets'][i]['Label']
exchange_codes.append(code)
self.exchange = exchange_codes
def Check_ric(self, ric, maturity):
"""
Support Function used within other functions in `Option_RIC` Class.
"""
exp_date = pd.Timestamp(maturity)
if pd.Timestamp(maturity) < datetime.now():
sdate = (exp_date - timedelta(600)).strftime('%Y-%m-%d')
edate = exp_date.strftime('%Y-%m-%d')
else:
sdate = (datetime.now() - timedelta(90)).strftime('%Y-%m-%d')
edate = datetime.now().strftime('%Y-%m-%d')
if self.debug:
print(f"Check_ric's (ric, sdate, edate) = ({ric}, {sdate}, {edate})")
# Now things are getting tricky.
# Certain Expired Options do not have 'TRDPRC_1' data historically. Some don't have 'SETTLE'. Some have both...
# The below should capture 'SETTLE' when it is available, but 'TRDPRC_1' might still be present in these instances.
# So we will need to build a logic to focus on the series with the most datapoints.
if ric.split('.')[1][0] == 'U':
prices = rd.content.historical_pricing.summaries.Definition(
ric, start=sdate, end=edate,
interval=rd.content.historical_pricing.Intervals.DAILY,
fields=['TRDPRC_1', 'BID', 'ASK']).get_data() # Later in the code, we will pick the column that has the fewest `<NA>`s. It could be that 'BID' and 'TRDPRC_1' have just as many `<NA>`s; the code will pick the 1st column with the same number of `<NA>`s in this case, so we have to ask for fields in order of importance. Here we're most interested in 'TRDPRC_1', the rest later.
else:
prices = rd.content.historical_pricing.summaries.Definition(
ric, start=sdate, end=edate,
interval=rd.content.historical_pricing.Intervals.DAILY,
fields=['SETTLE', 'TRDPRC_1', 'BID', 'ASK']).get_data()
if self.debug:
print(f"prices.data.df.isna().sum(axis=0).idxmin() = {prices.data.df.isna().sum(axis=0).idxmin()}")
print(f"prices.data.df[prices.data.df.isna().sum(axis=0).idxmin()] = {prices.data.df[prices.data.df.isna().sum(axis=0).idxmin()]}")
fullest_prices = pd.DataFrame( # Depending on which Option (and underlying) the user picks, the appropriate price field changes - annoyingly. `fullest_prices` attempts to pick only the column - and thus the field - with the fewest `NaN`s, which ought to be the correct field/column.
columns=[prices.data.df.isna().sum(axis=0).idxmin()], # This is the name of the column with the fewest NAs
data=prices.data.df[prices.data.df.isna().sum(axis=0).idxmin()]) # This is the column with the fewest NAs
return ric, prices, fullest_prices
def Get_asset_and_exchange(self):
"""
Support Function used within other functions in `Option_RIC` Class.
"""
asset_in_ric: Dict[str, Dict[str, str]] = {
'SSMI': {'EUX': 'OSMI'},
'GDAXI': {'EUX': 'GDAX'},
'ATX': {'EUX': 'FATXA'},
'STOXX50E': {'EUX': 'STXE'},
'FTSE': {'IEU': 'LFE', 'EUX': 'OTUK'},
'N225': {'OSA': 'JNI'},
'TOPX': {'OSA': 'JTI'}}
asset_exchange: Dict[str, str] = {}
if self.asset[0] != '.':
asset: str = self.asset.split('.')[0]
else:
asset: str = self.asset[1:]
for exch in self.exchange:
if asset in asset_in_ric:
# `.get` guards against exchange codes returned by Search that are absent from the mapping above
asset_exchange[exch] = asset_in_ric[asset].get(exch, asset)
else:
asset_exchange[exch] = asset
return asset_exchange
def Get_strike(self, exch):
if exch == 'OPQ':
if type(self.strike) == float:
int_part = int(self.strike)
dec_part = str(str(self.strike).split('.')[1])
else:
int_part = int(self.strike)
dec_part = '00'
if int(self.strike) < 10:
strike_ric = '00' + str(int_part) + dec_part
elif int_part >= 10 and int_part < 100:
strike_ric = '0' + str(int_part) + dec_part
elif int_part >= 100 and int_part < 1000:
strike_ric = str(int_part) + dec_part
elif int_part >= 1000 and int_part < 10000:
strike_ric = str(int_part) + '0'
elif int_part >= 10000 and int_part < 20000:
strike_ric = 'A' + str(int_part)[-4:]
elif int_part >= 20000 and int_part < 30000:
strike_ric = 'B' + str(int_part)[-4:]
elif int_part >= 30000 and int_part < 40000:
strike_ric = 'C' + str(int_part)[-4:]
elif int_part >= 40000 and int_part < 50000:
strike_ric = 'D' + str(int_part)[-4:]
elif exch == 'HKG' or exch == 'HFE':
if self.asset[0] == '.':
strike_ric = str(int(self.strike))
else:
strike_ric = str(int(self.strike * 100))
elif exch == 'OSA':
strike_ric = str(self.strike)[:3]
elif exch == 'EUX' or exch == 'IEU':
if type(self.strike) == float and len(str(int(self.strike))) == 1:
int_part = int(self.strike)
dec_part = str(str(self.strike).split('.')[1])[0]
strike_ric = '0' + str(int_part) + dec_part
elif (len(str(int(self.strike))) > 1 and exch == 'EUX'):
strike_ric = str(int(self.strike)) + '0'
elif (len(str(int(self.strike))) == 2 and exch == 'IEU'):
strike_ric = '0' + str(int(self.strike))
elif len(str(int(self.strike))) > 2 and exch == 'IEU':
strike_ric = str(int(self.strike))
return strike_ric
def Get_exp_month(self, exchange):
"""
Support Function used within other functions in `Option_RIC` Class.
"""
ident_opra = {
'1': {'exp': 'A', 'C_bigStrike': 'a', 'C_smallStrike': 'A',
'P_bigStrike': 'm', 'P_smallStrike': 'M'},
'2': {'exp': 'B', 'C_bigStrike': 'b', 'C_smallStrike': 'B',
'P_bigStrike': 'n', 'P_smallStrike': 'N'},
'3': {'exp': 'C', 'C_bigStrike': 'c', 'C_smallStrike': 'C',
'P_bigStrike': 'o', 'P_smallStrike': 'O'},
'4': {'exp': 'D', 'C_bigStrike': 'd', 'C_smallStrike': 'D',
'P_bigStrike': 'p', 'P_smallStrike': 'P'},
'5': {'exp': 'E', 'C_bigStrike': 'e', 'C_smallStrike': 'E',
'P_bigStrike': 'q', 'P_smallStrike': 'Q'},
'6': {'exp': 'F', 'C_bigStrike': 'f', 'C_smallStrike': 'F',
'P_bigStrike': 'r', 'P_smallStrike': 'R'},
'7': {'exp': 'G', 'C_bigStrike': 'g', 'C_smallStrike': 'G',
'P_bigStrike': 's', 'P_smallStrike': 'S'},
'8': {'exp': 'H', 'C_bigStrike': 'h', 'C_smallStrike': 'H',
'P_bigStrike': 't', 'P_smallStrike': 'T'},
'9': {'exp': 'I', 'C_bigStrike': 'i', 'C_smallStrike': 'I',
'P_bigStrike': 'u', 'P_smallStrike': 'U'},
'10': {'exp': 'J', 'C_bigStrike': 'j', 'C_smallStrike': 'J',
'P_bigStrike': 'v', 'P_smallStrike': 'V'},
'11': {'exp': 'K', 'C_bigStrike': 'k', 'C_smallStrike': 'K',
'P_bigStrike': 'w', 'P_smallStrike': 'W'},
'12': {'exp': 'L', 'C_bigStrike': 'l', 'C_smallStrike': 'L',
'P_bigStrike': 'x', 'P_smallStrike': 'X'}}
ident_all = {
'1': {'exp': 'A', 'C': 'A', 'P': 'M'},
'2': {'exp': 'B', 'C': 'B', 'P': 'N'},
'3': {'exp': 'C', 'C': 'C', 'P': 'O'},
'4': {'exp': 'D', 'C': 'D', 'P': 'P'},
'5': {'exp': 'E', 'C': 'E', 'P': 'Q'},
'6': {'exp': 'F', 'C': 'F', 'P': 'R'},
'7': {'exp': 'G', 'C': 'G', 'P': 'S'},
'8': {'exp': 'H', 'C': 'H', 'P': 'T'},
'9': {'exp': 'I', 'C': 'I', 'P': 'U'},
'10': {'exp': 'J', 'C': 'J', 'P': 'V'},
'11': {'exp': 'K', 'C': 'K', 'P': 'W'},
'12': {'exp': 'L', 'C': 'L', 'P': 'X'}}
if exchange == 'OPQ':
if self.strike > 999.999:
exp_month_code = ident_opra[str(
self.maturity.month)][self.opt_type + '_bigStrike']
else:
exp_month_code = ident_opra[str(
self.maturity.month)][self.opt_type + '_smallStrike']
else:
exp_month_code = ident_all[str(self.maturity.month)][self.opt_type]
if self.maturity < datetime.now():
expired = '^' + \
ident_all[str(self.maturity.month)]['exp'] + \
str(self.maturity.year)[-2:]
else:
expired = ''
return exp_month_code, expired
def RIC_prices(self, ric, ricPrices):
if self.debug:
print(f"ricPrices's ric: {ric}")
print(f"self.maturity: {self.maturity}")
ric, prices, full_prices = self.Check_ric(ric, self.maturity)
if prices is not None:
valid_ric = {ric: prices, f'{ric} fullest prices': full_prices}
ricPrices['valid_ric'].append(valid_ric)
else:
ricPrices['potential_rics'].append(ric)
return ricPrices
def Construct_RIC(self):
asset_exchange = self.Get_asset_and_exchange()
supported_exchanges = ['OPQ', 'IEU', 'EUX', 'HKG', 'HFE', 'OSA']
ricPrices = {'valid_ric': [], 'potential_rics': []}
for exchange, asset in asset_exchange.items():
if exchange in supported_exchanges:
strike_ric = self.Get_strike(exchange)
exp_month_code, expired = self.Get_exp_month(exchange)
if exchange == 'OPQ':
ric = asset + exp_month_code + \
str(self.maturity.day) + \
str(self.maturity.year)[-2:] + \
strike_ric + '.U' + expired
ricPrices = self.RIC_prices(ric, ricPrices)
elif exchange == 'HKG' or exchange == 'HFE':
gen_len = ['0', '1', '2', '3']
if exchange == 'HFE':
gen_len = ['']
for i in gen_len:
exchs = {'HKG': {'exch_code': '.HK', 'gen': str(i)},
'HFE': {'exch_code': '.HF', 'gen': ''}}
ric = asset + strike_ric + exchs[exchange]['gen'] + exp_month_code + str(
self.maturity.year)[-1:] + exchs[exchange]['exch_code'] + expired
ricPrices = self.RIC_prices(ric, ricPrices)
elif exchange == 'OSA':
for jnet in ['', 'L', 'R']:
if self.asset[0] == '.':
ric = asset + jnet + strike_ric + exp_month_code + \
str(self.maturity.year)[-1:] + '.OS' + expired
ricPrices = self.RIC_prices(ric, ricPrices)
else:
for gen in ['Y', 'Z', 'A', 'B', 'C']:
ric = asset + jnet + gen + strike_ric + exp_month_code + \
str(self.maturity.year)[-1:] + \
'.OS' + expired
ricPrices = self.RIC_prices(ric, ricPrices)
elif exchange == 'EUX' or exchange == 'IEU':
exchs = {'EUX': '.EX', 'IEU': '.L'}
for gen in ['', 'a', 'b', 'c', 'd']:
ric = asset + strike_ric + gen + exp_month_code + \
str(self.maturity.year)[-1:] + \
exchs[exchange] + expired
if self.debug: print(f"Construct_RIC's ric: {ric}")
try:
ricPrices = self.RIC_prices(ric, ricPrices)
except Exception:
if self.debug:
print("Error for self.RIC_prices(ric, ricPrices)")
else:
print(f'The {exchange} exchange is not supported yet')
return ricPrices
Let's try it all again with the scenario where we calculate values as of '2023-02-01' for the index 'Hang Seng Index':
timeOfCalc3 = "2023-02-01"
indexUnderlying3 = ".HSI"
timeOfCalcDatetime2 = datetime.strptime(timeOfCalc3, '%Y-%m-%d')
currentUnderlyingPrc2 = rd.get_history(
universe=[indexUnderlying3],
start=timeOfCalc3, # , end: "OptDateTime"=None
fields=["TRDPRC_1"],
interval="tick").iloc[-1][0]
currentUnderlyingPrc2
20192.4
We'll have to round our strike price:
int(round(currentUnderlyingPrc2, -2))
20200
HSI_test1 = Option_RIC(
maturity=timeOfCalc3,
strike=int(round(currentUnderlyingPrc2, -2)),
opt_type='P',
asset=indexUnderlying3,
debug=False)
HSI_test2 = HSI_test1.Construct_RIC()
list(HSI_test2['valid_ric'][0].keys())[0]
'HSI20200N3.HF^B23'
HSI_test3 = HSI_test2['valid_ric'][0][list(HSI_test2['valid_ric'][0].keys())[0]].data.df.head()
HSI_test3
| HSI20200N3.HF^B23 | SETTLE | TRDPRC_1 | BID | ASK |
|---|---|---|---|---|
| Date | ||||
| 2022-11-08 | 3569 | <NA> | <NA> | <NA> |
| 2022-11-09 | 3713 | <NA> | <NA> | <NA> |
| 2022-11-10 | 4025 | <NA> | <NA> | <NA> |
| 2022-11-11 | 2939 | <NA> | <NA> | <NA> |
| 2022-11-14 | 2665 | <NA> | <NA> | <NA> |
print(list(HSI_test2['valid_ric'][0].keys())[1])
HSI_test4 = HSI_test2['valid_ric'][0][list(HSI_test2['valid_ric'][0].keys())[1]]
HSI_test4.head()
HSI20200N3.HF^B23 fullest prices
| SETTLE | |
|---|---|
| Date | |
| 2022-11-08 | 3569 |
| 2022-11-09 | 3713 |
| 2022-11-10 | 4025 |
| 2022-11-11 | 2939 |
| 2022-11-14 | 2665 |
Now let's look at SPX:
timeOfCalc3 = '2022-02-10'
indexUnderlying3 = ".SPX"
timeOfCalcDatetime3 = datetime.strptime(timeOfCalc3, '%Y-%m-%d')
currentUnderlyingPrc3 = rd.get_history(
universe=[indexUnderlying3],
start=timeOfCalc3, # , end: "OptDateTime"=None
fields=["TRDPRC_1"],
interval="tick").iloc[-1][0]
currentUnderlyingPrc3
3971.27
SPX_test2 = Option_RIC(
maturity='2022-01-21',
strike=int(round(currentUnderlyingPrc3, -2)),
opt_type='P',
asset=indexUnderlying3,
debug=False)
SPX_test2 = SPX_test2.Construct_RIC()
list(SPX_test2['valid_ric'][0].keys())[0]
'SPXm212240000.U^A22'
SPX_test2['valid_ric'][0][list(SPX_test2['valid_ric'][0].keys())[0]].data.df
| SPXm212240000.U^A22 | TRDPRC_1 | BID | ASK |
|---|---|---|---|
| Date | |||
| 2020-12-21 | <NA> | 484.50 | 494.00 |
| 2020-12-22 | <NA> | 491.10 | 495.10 |
| 2020-12-23 | 473.3 | 485.10 | 488.80 |
| 2020-12-24 | <NA> | 471.70 | 477.40 |
| 2020-12-28 | <NA> | 449.60 | 453.20 |
| ... | ... | ... | ... |
| 2022-01-13 | 0.57 | 0.55 | 0.65 |
| 2022-01-14 | 0.4 | 0.35 | 0.50 |
| 2022-01-18 | 0.4 | 0.35 | 0.45 |
| 2022-01-19 | 0.25 | 0.20 | 0.30 |
| 2022-01-20 | 0.15 | 0.05 | 0.15 |
273 rows × 3 columns
print(list(SPX_test2['valid_ric'][0].keys())[1])
SPX_test2['valid_ric'][0][list(SPX_test2['valid_ric'][0].keys())[1]]
SPXm212240000.U^A22 fullest prices
| BID | |
|---|---|
| Date | |
| 2020-12-21 | 484.50 |
| 2020-12-22 | 491.10 |
| 2020-12-23 | 485.10 |
| 2020-12-24 | 471.70 |
| 2020-12-28 | 449.60 |
| ... | ... |
| 2022-01-13 | 0.55 |
| 2022-01-14 | 0.35 |
| 2022-01-18 | 0.35 |
| 2022-01-19 | 0.20 |
| 2022-01-20 | 0.05 |
273 rows × 1 columns
The most granular historical Options price data kept is the daily time-series. This daily data is captured by the Option_RIC().Construct_RIC() function above. Some Options' historical price data is most "wholesome" (in this case, "has the least amount of NaNs" - Not a Number) under the field name TRDPRC_1, some under SETTLE. While our preference - ceteris paribus (all else equal) - is TRDPRC_1, more "wholesome" data-sets are still preferable, so the "fullest prices" logic in Option_RIC().Construct_RIC() picks the series with the fewest NaNs.
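The "fullest prices" selection reduces to `df.isna().sum(axis=0).idxmin()`: count the missing values per column and keep the column with the fewest. A toy illustration with synthetic numbers (not real market prices):

```python
import pandas as pd

# Synthetic example: SETTLE is fully populated, TRDPRC_1 is mostly missing
prices = pd.DataFrame({
    "SETTLE":   [353.8, 349.3, 358.9, 476.6],
    "TRDPRC_1": [None,  None,  508.2, None],
})

fullest_col = prices.isna().sum(axis=0).idxmin()  # column name with fewest NaNs
fullest_prices = prices[[fullest_col]]            # keep it as a one-column DataFrame
print(fullest_col)  # → 'SETTLE'
```

Since `idxmin` returns the first column attaining the minimum, the order in which fields are requested acts as the tie-breaker, which is why the class asks for them in order of importance.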
list(HSI_test2['valid_ric'][0].keys())[0]
'HSI20200N3.HF^B23'
HSICurr = rd.get_data(
universe='.HSI',
fields=["CF_CURR"])
HSICurr
| Instrument | CF_CURR | |
|---|---|---|
| 0 | .HSI | 344 |
# Now we will try and find the strike price for the option found.
# If we were to use the logic above, in the function `Option_RIC().Get_strike()`, we would find a list of all the possible strike prices for options on this underlying, which is too large a group.
# We will instead use the RIC of the option found, which includes the strike:
import re # Python's built-in regular-expression library, used here to manipulate strings
hist_opt_found_strk_pr = re.findall(
r'(\d+|[A-Za-z]+)', # This will split the string into its runs of numerical and non-numerical characters.
list(HSI_test2['valid_ric'][0].keys())[0])[1][0:-1] # `[1]` here skips past 'HSI' to the numbers. `[0:-1]` ignores the last digit, which is not part of the strike price.
hist_opt_found_strk_pr
'2020'
hk_rf = 100 - rd.get_history(
universe=['HK3MT=RR'], # HK10YGB=EODF, HKGOV3MZ=R, HK3MT=RR
fields=['TR.MIDPRICE'],
start=HSI_test4.index[0].strftime('%Y-%m-%d'),
end=HSI_test4.index[-1].strftime('%Y-%m-%d')) # .iloc[::-1] # `.iloc[::-1]` is here so that the resulting data-frame is the same order as `HSI_test5` so we can merge them later
hk_rf
| HK3MT=RR | Mid Price |
|---|---|
| Date | |
| 2023-02-01 | 0.5925 |
| 2023-01-31 | 0.5745 |
| 2023-01-30 | 0.523 |
| 2023-01-27 | 0.532 |
| 2023-01-26 | 0.577 |
| ... | ... |
| 2022-11-14 | 0.71 |
| 2022-11-11 | 0.713 |
| 2022-11-10 | 0.731 |
| 2022-11-09 | 0.7365 |
| 2022-11-08 | 0.7445 |
62 rows × 1 columns
HSI_test5 = pd.merge(
HSI_test4, hk_rf,
left_index=True, right_index=True)
HSI_test5 = HSI_test5.rename(
columns={"SETTLE": "OptionPrice", "Mid Price": "RfRatePrct"})
HSI_test5.head()
| OptionPrice | RfRatePrct | |
|---|---|---|
| Date | ||
| 2022-11-08 | 3569 | 0.7445 |
| 2022-11-09 | 3713 | 0.7365 |
| 2022-11-10 | 4025 | 0.731 |
| 2022-11-11 | 2939 | 0.713 |
| 2022-11-14 | 2665 | 0.71 |
hist_HSI_undrlying_pr = rd.get_history(
universe=['.HSI'],
fields=["TRDPRC_1"],
# interval="1D",
start=HSI_test4.index[0].strftime('%Y-%m-%d'),
end=HSI_test4.index[-1].strftime('%Y-%m-%d')) # .iloc[::-1] # `.iloc[::-1]` is here so that the resulting data-frame is the same order as `HSI_test5` so we can merge them later
hist_HSI_undrlying_pr.head(2)
| Date | TRDPRC_1 (.HSI) |
|---|---|
| 2022-11-09 | 16358.52 |
| 2022-11-10 | 16081.04 |
HSI_test6 = pd.merge(HSI_test5, hist_HSI_undrlying_pr,
left_index=True, right_index=True)
HSI_test6 = HSI_test6.rename(
columns={"TRDPRC_1": "UndrlyingPr"})
HSI_test6.columns.name = list(HSI_test2['valid_ric'][0].keys())[0]  # This names the data-frame. Technically it names the column set, but the columns all belong to one instrument, so it amounts to the same thing.
HSI_test6.head(2)
| Date (HSI20200N3.HF^B23) | OptionPrice | RfRatePrct | UndrlyingPr |
|---|---|---|---|
| 2022-11-09 | 3713 | 0.7365 | 16358.52 |
| 2022-11-10 | 4025 | 0.731 | 16081.04 |
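Note that `HSI_test6` starts on 2022-11-09 even though `HSI_test5` starts on 2022-11-08: `pd.merge` with `left_index=True, right_index=True` performs an inner join by default, so any date missing from either side is dropped. A small illustration with made-up numbers:

```python
import pandas as pd

left = pd.DataFrame({"OptionPrice": [3569, 3713]},
                    index=pd.to_datetime(["2022-11-08", "2022-11-09"]))
right = pd.DataFrame({"UndrlyingPr": [16358.52]},
                     index=pd.to_datetime(["2022-11-09"]))

# Default `how='inner'`: only dates present in both indexes survive.
merged = pd.merge(left, right, left_index=True, right_index=True)
print(merged)  # only the 2022-11-09 row remains
```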
list(HSI_test2['valid_ric'][0].keys())[0]
'HSI20200N3.HF^B23'
HSI_test_start = HSI_test2['valid_ric'][0][list(HSI_test2['valid_ric'][0].keys())[1]].index[0]
# HSI_test_end = HSI_test2['valid_ric'][0][list(HSI_test2['valid_ric'][0].keys())[1]].index[-1]
(HSI_test1.maturity - HSI_test_start).days/365  # Expecting this to match `YearsToExpiry`, i.e. 'DaysToExpiry / 365'.
0.2328767123287671
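That 0.2329 figure is just the day count over 365. Assuming, consistently with the data-frames above, that the series starts on 2022-11-08 and the Option matures on 2023-02-01:

```python
from datetime import date

start = date(2022, 11, 8)    # first date in the price history above
maturity = date(2023, 2, 1)  # HSI_test1.maturity

# 85 calendar days between the two dates, over a 365-day year:
years_to_expiry = (maturity - start).days / 365
print(years_to_expiry)  # 0.2328767123287671
```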
HSI_test2['valid_ric'][0][
f"{list(HSI_test2['valid_ric'][0].keys())[0]} fullest prices"].head(2)
| Date | SETTLE |
|---|---|
| 2022-11-08 | 3569 |
| 2022-11-09 | 3713 |
HSI_test2['valid_ric'][0][list(HSI_test2['valid_ric'][0].keys())[0]].data.df.head(2)
| Date (HSI20200N3.HF^B23) | SETTLE | TRDPRC_1 | BID | ASK |
|---|---|---|---|---|
| 2022-11-08 | 3569 | <NA> | <NA> | <NA> |
| 2022-11-09 | 3713 | <NA> | <NA> | <NA> |
# rd.content.historical_pricing.summaries.Definition(
# list(HSI_test2['valid_ric'][0].keys())[0],
# start='2022-01-01',
# end='2024-02-27',
# interval=rd.content.historical_pricing.Intervals.DAILY,
# fields=['TRDPRC_1', 'BID', 'ASK', 'EXPIR_DATE', 'TR.FOFirstTradingDate']).get_data().data.df
HSI_test2_exp_date = HSI_test1.maturity.strftime('%Y-%m-%d')
HSI_test2_exp_date
'2023-02-01'
# df = [
# option.Definition(
# underlying_type=option.UnderlyingType.ETI,
# buy_sell='Buy',
# instrument_code=list(HSI_test2['valid_ric'][0].keys())[0], # 'STXE42000D3.EX' # 'HSI19300N3.HF^B23', # list(HSI_test2['valid_ric'][0].keys())[0],
# strike=float(hist_opt_found_strk_pr),
# pricing_parameters=option.PricingParameters(
# market_value_in_deal_ccy=float(HSI_test6['OptionPrice'][i]),
# risk_free_rate_percent=float(HSI_test6['RfRatePrct'][i]),
# underlying_price=float(HSI_test6['UndrlyingPr'][i]),
# pricing_model_type='BlackScholes',
# volatility_type='Implied',
# underlying_time_stamp='Default',
# report_ccy='HKD'
# ))
# for i in range(len(HSI_test6.index))]
option.Definition
hist_daily_universe_l = [
    option.Definition(
        instrument_tag='hist_daily_universe_l',
        end_date=HSI_test2_exp_date,
        buy_sell='Buy',
        call_put='Call',
        exercise_style='AMER',  # 'EURO'
        underlying_type=option.UnderlyingType.ETI,
        strike=float(hist_opt_found_strk_pr),
        tenor=str((HSI_test1.maturity - HSI_test_start).days/365),  # Meant to be `YearsToExpiry`, i.e. 'DaysToExpiry / 365'. NB: `EtiOptionDefinition` has no `tenor` member, which is what triggers the 400 error below.
        notional_ccy='HKD',
        # notional_amount=,
        # asian_definition=,
        # barrier_definition=,
        # binary_definition=,
        # double_barrier_definition=,
        # double_binary_definition=,
        # dual_currency_definition=,
        # forward_start_definition=,
        # underlying_definition=,
        delivery_date=HSI_test1.maturity.strftime('%Y-%m-%d'),
        instrument_code='STXE42000D3.EX',  # 'HSI19300N3.HF^B23', # list(HSI_test2['valid_ric'][0].keys())[0],
        # cbbc_definition=,
        # double_barriers_definition=,
        # deal_contract=,
        # end_date_time=,
        # lot_size=,
        # offset=,
        # extended_params=,
        pricing_parameters=option.PricingParameters(
            valuation_date=HSI_test6.index[i].strftime('%Y-%m-%d'),
            market_value_in_deal_ccy=float(HSI_test6['OptionPrice'][i]),
            risk_free_rate_percent=float(HSI_test6['RfRatePrct'][i]),
            underlying_price=float(HSI_test6['UndrlyingPr'][i]),
            pricing_model_type='BlackScholes',
            volatility_type='Implied',
            underlying_time_stamp='Default',
            report_ccy='HKD'
        ))
    for i in range(len(HSI_test6.index))]
# hist_daily_universe_l = [
# option.Definition(
# underlying_type=option.UnderlyingType.ETI,
# buy_sell='Buy',
# instrument_code=list(HSI_test2['valid_ric'][0].keys())[0], # 'STXE42000D3.EX' # 'HSI19300N3.HF^B23', # list(HSI_test2['valid_ric'][0].keys())[0],
# strike=float(hist_opt_found_strk_pr),
# pricing_parameters=option.PricingParameters(
# valuation_date=HSI_test6.index[i].strftime('%Y-%m-%d'),
# market_value_in_deal_ccy=float(HSI_test6['OptionPrice'][i]),
# risk_free_rate_percent=float(HSI_test6['RfRatePrct'][i]),
# underlying_price=float(HSI_test6['UndrlyingPr'][i]),
# pricing_model_type='BlackScholes',
# volatility_type='Implied',
# underlying_time_stamp='Default',
# report_ccy='HKD'
# ))
# for i in range(len(HSI_test6.index))]
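The loop below relies on a `Chunks` helper defined earlier in the article. A minimal equivalent (a sketch; the original implementation may differ) simply slices a list into batches of at most `n` items:

```python
def Chunks(lst, n):
    """Yield successive n-sized chunks from `lst`; the last chunk may be shorter."""
    for i in range(0, len(lst), n):
        yield lst[i:i + n]

# e.g. 61 Option Definitions in batches of 100 -> a single batch of 61:
batches = list(Chunks(list(range(61)), 100))
print(len(batches), len(batches[0]))  # 1 61
```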
batchOf = 100
for i, j in enumerate(Chunks(hist_daily_universe_l, batchOf)):
    print(f"Batch of {len(j)} requests no. {i+1}/{len(list(Chunks(hist_daily_universe_l, batchOf)))} started")
    # Send this batch of Option Definitions to the IPA calculators via the content layer:
    response6 = rdf.Definitions(universe=j, fields=requestFields)
    response6 = response6.get_data()
    if i == 0:
        response6df = response6.data.df
    else:
        response6df = pd.concat([response6df, response6.data.df], ignore_index=True)  # `DataFrame.append` was removed in pandas 2.0; use `pd.concat` instead.
    print(f"Batch of {len(j)} requests no. {i+1}/{len(list(Chunks(hist_daily_universe_l, batchOf)))} ended")
Batch of 61 requests no. 1/1 started
---------------------------------------------------------------------------
RDError                                   Traceback (most recent call last)
Input In [150], in <cell line: 2>()
      5 response6 = rdf.Definitions(universe=j, fields=requestFields)
----> 6 response6 = response6.get_data()
File ...\site-packages\refinitiv\data\content\ipa\_ipa_content_provider.py:134, in IPAContentProviderLayer.get_data(self, session, on_response, async_mode)
--> 134 response = super().get_data(session, on_response)
File ...\site-packages\refinitiv\data\delivery\_data\_data_provider.py:636, in _check_response(response, config, response_class)
--> 636 raise error
RDError: Error code 400 | Invalid input: Unbindable json. Could not find member 'tenor' on object of type 'EtiOptionDefinition'. Path 'tenor', line 1, position 112.
response6
HSIdf = response6df.copy()
HSIdf.rename(columns={"Volatility": 'ImpliedVolatility'}, inplace=True)
HSIdf
HSIdf[5:40]
| | MarketValueInDealCcy | RiskFreeRatePercent | UnderlyingPrice | PricingModelType | DividendType | UnderlyingTimeStamp | ReportCcy | VolatilityType | ImpliedVolatility | DeltaPercent | GammaPercent | RhoPercent | ThetaPercent | VegaPercent |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 5 | <NA> | <NA> | <NA> | <NA> | <NA> | <NA> | <NA> | <NA> | <NA> | <NA> | <NA> | <NA> | <NA> | <NA> |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 39 | <NA> | <NA> | <NA> | <NA> | <NA> | <NA> | <NA> | <NA> | <NA> | <NA> | <NA> | <NA> | <NA> | <NA> |
Every value in rows 5 through 39 is <NA>.
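As a reminder of what IPA is solving for (and as a slow, local sanity check), the Implied Volatility is the sigma at which the model price matches the observed Option price. Below is a minimal sketch for a European call under Black-Scholes, solved by bisection; note that the HSI Options above are American-style, so this is only indicative, and it is exactly the kind of per-point numerical root-finding that IPA saves us from running ourselves:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma, q=0.0):
    """Black-Scholes price of a European call with continuous dividend yield q."""
    d1 = (log(S / K) + (r - q + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * exp(-q * T) * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

def implied_vol(price, S, K, T, r, q=0.0, lo=1e-6, hi=5.0, tol=1e-8):
    """Bisect on sigma: the call price is monotonically increasing in sigma."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid, q) > price:
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)
```

For instance, pricing a call at sigma = 0.3 and feeding that price back into `implied_vol` recovers 0.3 to within the tolerance.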
HSI_test6_del_lay_list = [
    {
        "instrumentType": "Option",
        "instrumentDefinition": {
            "buySell": "Buy",
            "underlyingType": "Eti",
            "instrumentCode": list(HSI_test2['valid_ric'][0].keys())[0],  # "instrumentCode": None,
            "strike": float(hist_opt_found_strk_pr),
        },
        "pricingParameters": {
            "marketValueInDealCcy": float(HSI_test6['OptionPrice'][i]),
            "riskFreeRatePercent": float(HSI_test6['RfRatePrct'][i]),
            "underlyingPrice": float(HSI_test6['UndrlyingPr'][i]),
            "pricingModelType": "BlackScholes",
            "dividendType": "ImpliedYield",
            "volatilityType": "Implied",
            "underlyingTimeStamp": "Default",
            "reportCcy": "HKD"
        }
    }
    for i in range(len(HSI_test6.index))]
batchOf = 100
for i, j in enumerate(Chunks(HSI_test6_del_lay_list, batchOf)):
    print(f"Batch of {batchOf} requests no. {i+1}/{len(list(Chunks(HSI_test6_del_lay_list, batchOf)))} started")
    # Send this batch straight to the IPA financial-contracts endpoint via the delivery layer:
    request_definition = rd.delivery.endpoint_request.Definition(
        method=rd.delivery.endpoint_request.RequestMethod.POST,
        url='https://api.refinitiv.com/data/quantitative-analytics/v1/financial-contracts',
        body_parameters={"fields": requestFields,
                         "outputs": ["Data", "Headers"],
                         "universe": j})
    response8 = request_definition.get_data()
    headers_name = [h['name'] for h in response8.data.raw['headers']]
    if i == 0:
        response8df = pd.DataFrame(
            data=response8.data.raw['data'], columns=headers_name)
        print({"fields": requestFields,
               "outputs": ["Data", "Headers"],
               "universe": j})
    else:
        _response8df = pd.DataFrame(
            data=response8.data.raw['data'], columns=headers_name)
        response8df = pd.concat([response8df, _response8df], ignore_index=True)  # `DataFrame.append` was removed in pandas 2.0; use `pd.concat` instead.
        # display(_response8df)
    print(f"Batch of {batchOf} requests no. {i+1}/{len(list(Chunks(HSI_test6_del_lay_list, batchOf)))} ended")
Batch of 100 requests no. 1/1 started
{'fields': ['MarketValueInDealCcy', 'RiskFreeRatePercent', 'UnderlyingPrice', 'PricingModelType', 'DividendType', 'UnderlyingTimeStamp', 'ReportCcy', 'VolatilityType', 'Volatility', 'DeltaPercent', 'GammaPercent', 'RhoPercent', 'ThetaPercent', 'VegaPercent'],
 'outputs': ['Data', 'Headers'],
 'universe': [{'instrumentType': 'Option', 'instrumentDefinition': {'buySell': 'Buy', 'underlyingType': 'Eti', 'instrumentCode': 'HSI19800N3.HF^B23', 'strike': 1980.0}, 'pricingParameters': {'marketValueInDealCcy': 3352.0, 'riskFreeRatePercent': 0.7365000000000066, 'underlyingPrice': 16358.52, 'pricingModelType': 'BlackScholes', 'dividendType': 'ImpliedYield', 'volatilityType': 'Implied', 'underlyingTimeStamp': 'Default', 'reportCcy': 'HKD'}},
              {'instrumentType': 'Option', 'instrumentDefinition': {'buySell': 'Buy', 'underlyingType': 'Eti', 'instrumentCode': 'HSI19800N3.HF^B23', 'strike': 1980.0}, 'pricingParameters': {'marketValueInDealCcy': 3656.0, 'riskFreeRatePercent': 0.7309999999999945, 'underlyingPrice': 16081.04, 'pricingModelType': 'BlackScholes', 'dividendType': 'ImpliedYield', 'volatilityType': 'Implied', 'underlyingTimeStamp': 'Default', 'reportCcy': 'HKD'}},
              ...]}
The remaining entries in 'universe' (one per row of HSI_test6) are identical apart from 'marketValueInDealCcy', 'riskFreeRatePercent' and 'underlyingPrice', which take each day's Option price, risk-free rate and underlying price in turn.
'ImpliedYield', 'volatilityType': 'Implied', 'underlyingTimeStamp': 'Default', 'reportCcy': 'HKD'}}, {'instrumentType': 'Option', 'instrumentDefinition': {'buySell': 'Buy', 'underlyingType': 'Eti', 'instrumentCode': 'HSI19800N3.HF^B23', 'strike': 1980.0}, 'pricingParameters': {'marketValueInDealCcy': 103.0, 'riskFreeRatePercent': 0.707499999999996, 'underlyingPrice': 21678.0, 'pricingModelType': 'BlackScholes', 'dividendType': 'ImpliedYield', 'volatilityType': 'Implied', 'underlyingTimeStamp': 'Default', 'reportCcy': 'HKD'}}, {'instrumentType': 'Option', 'instrumentDefinition': {'buySell': 'Buy', 'underlyingType': 'Eti', 'instrumentCode': 'HSI19800N3.HF^B23', 'strike': 1980.0}, 'pricingParameters': {'marketValueInDealCcy': 106.0, 'riskFreeRatePercent': 0.6640000000000015, 'underlyingPrice': 21650.98, 'pricingModelType': 'BlackScholes', 'dividendType': 'ImpliedYield', 'volatilityType': 'Implied', 'underlyingTimeStamp': 'Default', 'reportCcy': 'HKD'}}, {'instrumentType': 'Option', 'instrumentDefinition': {'buySell': 'Buy', 'underlyingType': 'Eti', 'instrumentCode': 'HSI19800N3.HF^B23', 'strike': 1980.0}, 'pricingParameters': {'marketValueInDealCcy': 74.0, 'riskFreeRatePercent': 0.5955000000000013, 'underlyingPrice': 22044.65, 'pricingModelType': 'BlackScholes', 'dividendType': 'ImpliedYield', 'volatilityType': 'Implied', 'underlyingTimeStamp': 'Default', 'reportCcy': 'HKD'}}, {'instrumentType': 'Option', 'instrumentDefinition': {'buySell': 'Buy', 'underlyingType': 'Eti', 'instrumentCode': 'HSI19800N3.HF^B23', 'strike': 1980.0}, 'pricingParameters': {'marketValueInDealCcy': 74.0, 'riskFreeRatePercent': 0.5955000000000013, 'underlyingPrice': 22044.65, 'pricingModelType': 'BlackScholes', 'dividendType': 'ImpliedYield', 'volatilityType': 'Implied', 'underlyingTimeStamp': 'Default', 'reportCcy': 'HKD'}}, {'instrumentType': 'Option', 'instrumentDefinition': {'buySell': 'Buy', 'underlyingType': 'Eti', 'instrumentCode': 'HSI19800N3.HF^B23', 'strike': 1980.0}, 
'pricingParameters': {'marketValueInDealCcy': 74.0, 'riskFreeRatePercent': 0.5955000000000013, 'underlyingPrice': 22044.65, 'pricingModelType': 'BlackScholes', 'dividendType': 'ImpliedYield', 'volatilityType': 'Implied', 'underlyingTimeStamp': 'Default', 'reportCcy': 'HKD'}}, {'instrumentType': 'Option', 'instrumentDefinition': {'buySell': 'Buy', 'underlyingType': 'Eti', 'instrumentCode': 'HSI19800N3.HF^B23', 'strike': 1980.0}, 'pricingParameters': {'marketValueInDealCcy': 74.0, 'riskFreeRatePercent': 0.5955000000000013, 'underlyingPrice': 22044.65, 'pricingModelType': 'BlackScholes', 'dividendType': 'ImpliedYield', 'volatilityType': 'Implied', 'underlyingTimeStamp': 'Default', 'reportCcy': 'HKD'}}, {'instrumentType': 'Option', 'instrumentDefinition': {'buySell': 'Buy', 'underlyingType': 'Eti', 'instrumentCode': 'HSI19800N3.HF^B23', 'strike': 1980.0}, 'pricingParameters': {'marketValueInDealCcy': 40.0, 'riskFreeRatePercent': 0.5769999999999982, 'underlyingPrice': 22566.78, 'pricingModelType': 'BlackScholes', 'dividendType': 'ImpliedYield', 'volatilityType': 'Implied', 'underlyingTimeStamp': 'Default', 'reportCcy': 'HKD'}}, {'instrumentType': 'Option', 'instrumentDefinition': {'buySell': 'Buy', 'underlyingType': 'Eti', 'instrumentCode': 'HSI19800N3.HF^B23', 'strike': 1980.0}, 'pricingParameters': {'marketValueInDealCcy': 28.0, 'riskFreeRatePercent': 0.5319999999999965, 'underlyingPrice': 22688.9, 'pricingModelType': 'BlackScholes', 'dividendType': 'ImpliedYield', 'volatilityType': 'Implied', 'underlyingTimeStamp': 'Default', 'reportCcy': 'HKD'}}, {'instrumentType': 'Option', 'instrumentDefinition': {'buySell': 'Buy', 'underlyingType': 'Eti', 'instrumentCode': 'HSI19800N3.HF^B23', 'strike': 1980.0}, 'pricingParameters': {'marketValueInDealCcy': 60.0, 'riskFreeRatePercent': 0.5229999999999961, 'underlyingPrice': 22069.73, 'pricingModelType': 'BlackScholes', 'dividendType': 'ImpliedYield', 'volatilityType': 'Implied', 'underlyingTimeStamp': 'Default', 'reportCcy': 
'HKD'}}, {'instrumentType': 'Option', 'instrumentDefinition': {'buySell': 'Buy', 'underlyingType': 'Eti', 'instrumentCode': 'HSI19800N3.HF^B23', 'strike': 1980.0}, 'pricingParameters': {'marketValueInDealCcy': 66.0, 'riskFreeRatePercent': 0.5745000000000005, 'underlyingPrice': 21842.33, 'pricingModelType': 'BlackScholes', 'dividendType': 'ImpliedYield', 'volatilityType': 'Implied', 'underlyingTimeStamp': 'Default', 'reportCcy': 'HKD'}}, {'instrumentType': 'Option', 'instrumentDefinition': {'buySell': 'Buy', 'underlyingType': 'Eti', 'instrumentCode': 'HSI19800N3.HF^B23', 'strike': 1980.0}, 'pricingParameters': {'marketValueInDealCcy': 47.0, 'riskFreeRatePercent': 0.5925000000000011, 'underlyingPrice': 22072.18, 'pricingModelType': 'BlackScholes', 'dividendType': 'ImpliedYield', 'volatilityType': 'Implied', 'underlyingTimeStamp': 'Default', 'reportCcy': 'HKD'}}]}
Batch of 100 requests no. 1/1 ended
response8df
| | MarketValueInDealCcy | RiskFreeRatePercent | UnderlyingPrice | PricingModelType | DividendType | UnderlyingTimeStamp | ReportCcy | VolatilityType | Volatility | DeltaPercent | GammaPercent | RhoPercent | ThetaPercent | VegaPercent |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | None | None | None | None | None | None | None | None | None | None | None | None | None | None |
| 1 | None | None | None | None | None | None | None | None | None | None | None | None | None | None |
| 2 | None | None | None | None | None | None | None | None | None | None | None | None | None | None |
| 3 | None | None | None | None | None | None | None | None | None | None | None | None | None | None |
| 4 | None | None | None | None | None | None | None | None | None | None | None | None | None | None |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 56 | None | None | None | None | None | None | None | None | None | None | None | None | None | None |
| 57 | None | None | None | None | None | None | None | None | None | None | None | None | None | None |
| 58 | None | None | None | None | None | None | None | None | None | None | None | None | None | None |
| 59 | None | None | None | None | None | None | None | None | None | None | None | None | None | None |
| 60 | None | None | None | None | None | None | None | None | None | None | None | None | None | None |
61 rows × 14 columns
We are now going to look into using PEP 3107 and PEP 484 type annotations (and some decorators). In line with these PEPs, I will also now use PEP 8 naming conventions.
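To see what static type checking buys us before wiring up `nb_mypy`, here is a small, hypothetical annotated function (not part of the article's class): with PEP 3107/484 annotations in place, the checker can flag a call like `mean_price(["a", "b"])` in the notebook before it ever runs.

```python
from typing import List

def mean_price(prices: List[float]) -> float:
    """Arithmetic mean of a list of prices; the annotations let mypy check call sites."""
    return sum(prices) / len(prices)

print(mean_price([100.0, 102.0, 101.0]))  # → 101.0
```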
import nb_mypy # !pip3 install nb_mypy --trusted-host pypi.org # https://pypi.org/project/nb-mypy/ # https://gitlab.tue.nl/jupyter-projects/nb_mypy/-/blob/master/Nb_Mypy.ipynb
%load_ext nb_mypy
Version 1.0.4
%reload_ext nb_mypy
Version 1.0.4
%nb_mypy On
%nb_mypy DebugOff
# %nb_mypy unknown
from __future__ import annotations # This native module must be imported first in the cell; it allows us to use a not-yet-fully-defined class as a Type Hint inside that class.
from datetime import date as dtdate
from datetime import datetime, timedelta, timezone # We use these to manipulate time values.
from dateutil.relativedelta import relativedelta # We use `relativedelta` to manipulate time values, aiding the `calendar` library.
from typing import Tuple, Union, Dict, List, Any
import calendar # We use `calendar` to identify holidays and maturity dates of instruments of interest.
import pytz # We use `pytz` to manipulate time values, aiding the `calendar` library. To import its types, you might need to run `!python3 -m pip install types-pytz`.
import numpy as np # We need `numpy` for mathematical and array manipulations.
import pandas as pd # We need `pandas` for dataframe and array manipulations.
import pandas_market_calendars as mcal # Used to identify holidays. See `https://github.com/rsheftel/pandas_market_calendars/blob/master/examples/usage.ipynb` for info on this market calendar library.
import refinitiv.data as rd # This is LSEG's Data and Analytics' API wrapper, called the Refinitiv Data Library for Python.
from refinitiv.data.content import historical_pricing # We will use this Python Class in `rd` to show the Implied Volatility data already available before our work.
from refinitiv.data.content import search # We will use this Python Class in `rd` to find the instrument we are after, closest to At The Money.
from refinitiv.data.content.ipa.financial_contracts import option # We need this to use the content layer of the RD library and the calculators of Greeks and Implied Volatility in IPA & ETI.
import refinitiv.data.content.ipa.financial_contracts as rdf # We need this to use the content layer of the RD library and the calculators of Greeks and Implied Volatility in Instrument Pricing Analytics (IPA) and Exchange Traded Instruments (ETI).
# `plotly` is a library used to render interactive graphs:
import plotly
import plotly.graph_objects as go
import plotly.express as px # This is just to see the implied vol graph when that field is available.
import matplotlib.pyplot as plt # We use `matplotlib` as a fallback in case users do not have an environment suited to `plotly`.
from IPython.display import display, clear_output # We use `clear_output` for users who wish to loop graph production on a regular basis.
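The `calendar` import above does the heavy lifting in the expiry-date logic further down: with a calendar whose weeks start on Saturday, every listed week ends on a Friday, so the last day of a month's third listed week is its third Friday. A minimal sketch of just that step (holiday adjustment left out; `third_friday` is my own illustrative name):

```python
import calendar
from datetime import date

def third_friday(year: int, month: int) -> date:
    # With weeks starting on Saturday, the first listed week always contains
    # the month's first Friday, so index [2] gives the third week and its
    # last element ([-1]) is the month's third Friday.
    c = calendar.Calendar(firstweekday=calendar.SATURDAY)
    return c.monthdatescalendar(year, month)[2][-1]

print(third_friday(2023, 3))  # → 2023-03-17, the March 2023 monthly expiry
```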
# Let's authenticate ourselves to LSEG's Data and Analytics service, Refinitiv:
try: # The config file and desktop session are not available in CodeBook, hence this try/except block
    rd.open_session(config_name="C:\\Example.DataLibrary.Python-main\\Example.DataLibrary.Python-main\\Configuration\\refinitiv-data.config.json")
    rd.open_session("desktop.workspace")
except Exception:
    rd.open_session()
print(f"Here we are using the Refinitiv Data Library version {rd.__version__}")
Here we are using the refinitiv Data Library version 1.0.0b24
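The `IPA_calc` method defined below repeatedly aligns the option's, the underlying's and the risk-free rate's series on a common 10-minute grid via `resample(...).mean()` plus a forward fill. Here is that idiom on its own, with synthetic data rather than market data (`'10min'` is equivalent to the `'10Min'` alias used later):

```python
import pandas as pd

idx = pd.to_datetime(["2023-03-14 09:03", "2023-03-14 09:27", "2023-03-14 09:41"])
trades = pd.DataFrame({"TRDPRC_1": [110.0, 108.0, 106.5]}, index=idx)

# Average trades within each 10-minute bucket, then carry the last
# observed value forward through buckets where no trade happened.
aligned = trades.resample("10min").mean().ffill()
print(aligned)
```

The 09:10 and 09:30 buckets contain no trades, so they inherit the previous bucket's value, which is exactly how the gaps in sparsely traded option series are filled below.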
class index_imp_vola_and_greeks_IPA_calc(): # All about Type Hints here: https://realpython.com/python-type-checking/#static-type-checking
def __init__( # Constructor
self,
index_underlying: str = ".STOXX50E"
):
self.index_underlying: str = index_underlying
# self.expiryYearOfInterest: int = datetime.now().year
# self.graphStyle: str = 'without out of trading hours' # 'overlay', '3 graphs', 'simple'
# self.graphTemplate: str = 'plotly_dark'
# self.debug: bool = False
# self.returnDfGraph: bool = False
# # def change_attrs(self, **kwargs): for kwarg in kwargs: self.__setattr__(kwarg, kwargs[kwarg])
def get_exp_dates(
self,
year: int = datetime.now().year,
days: bool = True,
mcal_get_calendar: str = 'EUREX'
) -> Dict[int, List[Union[dtdate, int]]]:
'''
get_exp_dates Version 4.0:
This function gets the monthly expiration dates of index options for a given year; these fall on the 3rd Friday of each month.
Changes
----------------------------------------------
Changed from Version 1.0 to 2.0: Jonathan Legrand changed Haykaz Aramyan's original code to allow
(i) for the function's holiday argument to be changed, defaulted to 'EUREX' as opposed to 'CBOE_Index_Options', and
(ii) for the function to output full date objects, as opposed to just days of the month, when the argument days=False.
Changed from Version 2.0 to 3.0: Jonathan Legrand changed this function to reflect the fact that it can be used for indices other than those on EUREX.
Changed from Version 3.0 to 4.0: Jonathan Legrand changed this function to be in line with PEP 3107 (type hints).
Dependencies
----------------------------------------------
Python library 'pandas_market_calendars' version 3.2
Parameters
-----------------------------------------------
Input:
year (int): Year for which expiration days are requested.
mcal_get_calendar (str): String naming the calendar whose holidays have to be taken into account. More on this calendar (link to GitHub, checked 2022-10-11): https://github.com/rsheftel/pandas_market_calendars/blob/177e7922c7df5ad249b0d066b5c9e730a3ee8596/pandas_market_calendars/exchange_calendar_cboe.py
Default: mcal_get_calendar='EUREX'
days (bool): If True, only the days of the month are output; otherwise datetime.date objects are.
Default: days=True
Output
-----------------------------------------------
dates (dict): Dictionary of expiration days for each month of the specified year, as day-of-month integers if days=True, else as datetime.date objects.
'''
i: int # this is for the 'for loop' in this function coming below
# Get the market holidays of the exchange named in `mcal_get_calendar`
Cal: mcal.MarketCalendar = mcal.get_calendar(mcal_get_calendar)
holidays: Tuple[np.datetime64, ...] = Cal.holidays().holidays
# set calendar starting from Saturday
c: calendar.Calendar = calendar.Calendar(firstweekday=calendar.SATURDAY)
# get the 3rd Friday of each month
exp_dates: dict = {} # https://stackoverflow.com/questions/48054521/indicating-multiple-value-in-a-dict-for-type-hints
date: dtdate
for i in range(1, 13):
date = c.monthdatescalendar(year, i)[2][-1]
# Check if the found date is a holiday and take the previous day if it is
if date in holidays:
date = date + timedelta(-1)
# append the date to the dictionary
if year in exp_dates and days:
exp_dates[year].append(date.day)
elif year in exp_dates:
exp_dates[year].append(date)
elif days:
exp_dates[year] = [date.day]
else:
exp_dates[year] = [date]
return exp_dates
def search_index_opt_ATM(
self,
debug: bool = False,
after: int = 15,
call_or_put: str = 'Put',
searchFields: List[str] = ["ExchangeCode", "UnderlyingQuoteName"],
include_weekly_opts: bool = False,
topNuSearchResults: int = 10_000,
timeOfCalcDatetime: datetime = datetime.now(), # Here we allow for historical analysis.
underMrktPriceField: str = "TRDPRC_1"
) -> index_imp_vola_and_greeks_IPA_calc:
self.after = after
self.timeOfCalcDatetime = timeOfCalcDatetime
self.underMrktPriceField = underMrktPriceField
i: int; j: dtdate; k: str # this is for the 'for loop' in this function coming below
self.exchangeC: str; self.exchangeRIC: str; self.mcalGetCalendar: str
if self.index_underlying == ".STOXX50E":
self.exchangeC, self.exchangeRIC, self.mcalGetCalendar = 'EUX', 'STX', 'EUREX'
elif self.index_underlying == '.SPX':
self.exchangeC, self.exchangeRIC, self.mcalGetCalendar = 'OPQ', 'SPX', 'CBOE_Futures' # Arguably this ought to be 'CBOE_Index_Options', but 'CBOE_Futures' is the calendar used here.
timeOfCalcStr: str=timeOfCalcDatetime.strftime('%Y-%m-%d')
fullDatesAtTimeOfCalc: dict = self.get_exp_dates(
year=timeOfCalcDatetime.year,
days=False,
mcal_get_calendar=self.mcalGetCalendar)
fullDatesAtTimeOfCalcDatetime: List[datetime] = [
datetime(j.year, j.month, j.day)
for j in fullDatesAtTimeOfCalc[
list(fullDatesAtTimeOfCalc.keys())[0]]]
expiryDateOfInt: datetime = [
j for j in fullDatesAtTimeOfCalcDatetime
if j > timeOfCalcDatetime + relativedelta(days=self.after)][0]
if debug: print(f"expiryDateOfInt: {expiryDateOfInt}")
# Certain search fields are necessary for the next steps, so let's add them to the `searchFields` object:
for k in ['DocumentTitle', 'RIC', 'StrikePrice', 'UnderlyingQuoteRIC'][::-1]: # the `[::-1]` reverses the list
searchFields.insert(0, k)
# Now let's build our Search filter:
_filter: str = f"RCSAssetCategoryLeaf eq 'Option' \
and RIC eq '{self.exchangeRIC}*' \
and CallPutOption eq '{call_or_put}' \
and ExchangeCode eq '{self.exchangeC}' \
and ExpiryDate ge {(expiryDateOfInt - relativedelta(days=1)).strftime('%Y-%m-%d')} \
and ExpiryDate lt {(expiryDateOfInt + relativedelta(days=1)).strftime('%Y-%m-%d')}"
if not include_weekly_opts:
_filter += " and DocumentTitle ne '*Weekly*'"
response1 = search.Definition(
view=search.Views.SEARCH_ALL, # To see what views are available: `help(search.Views)` & `search.metadata.Definition(view = search.Views.SEARCH_ALL).get_data().data.df.to_excel("SEARCH_ALL.xlsx")`
query=self.index_underlying,
select=', '.join(map(str, searchFields)),
filter=_filter, # ge (greater than or equal to), gt (greater than), lt (less than) and le (less than or equal to). These can only be applied to numeric and date properties.
top=topNuSearchResults,
).get_data()
self.searchDf: pd.DataFrame = response1.data.df
searchDf: pd.DataFrame = self.searchDf
if debug:
print("searchDf")
display(searchDf)
try:
self.underlyingPrice: float = rd.get_history( # The last traded price of the underlying index, a float
universe=[self.index_underlying],
fields=[underMrktPriceField],
interval="tick").iloc[-1][0]
except Exception:
print("Function failed at the search stage, returning the following dataframe: ")
display(searchDf)
if debug:
print(f"Underlying {self.index_underlying}'s price recorded here was {self.underlyingPrice}")
display(searchDf.iloc[(searchDf.StrikePrice-self.underlyingPrice).abs().argsort()[:10]])
self.instrument: str = searchDf.iloc[(
searchDf.StrikePrice-self.underlyingPrice).abs().argsort()[:1]].RIC.values[0]
self.instrumentInfo: pd.DataFrame = searchDf.iloc[(
searchDf.StrikePrice-self.underlyingPrice).abs().argsort()[:1]]
self.ATMOpt = self.instrument
return self
def IPA_calc(
self,
dateBack: int = 3,
optnMrktPriceField: str = "TRDPRC_1",
debug: bool = False,
atOptionTradeOnly: bool = True,
riskFreeRatePrct: Union[str, None] = None,
riskFreeRatePrctField: Union[str, None] = None,
timeZoneInGraph: datetime = datetime.now().astimezone(),
requestFields: List[str] = [
"DeltaPercent", "GammaPercent", "RhoPercent",
"ThetaPercent", "VegaPercent"],
searchBatchMax: int = 100
) -> index_imp_vola_and_greeks_IPA_calc:
i: int # Type Hinted for loops coming up below.
k: str # Type Hinted for loops coming up below.
n: int # Type Hinted for loops coming up below.
m: int # Type Hinted for loops coming up below.
p: rdf._base_definition.BaseDefinition # Type Hinted for loops coming up below.
self.dateBack: int = dateBack
self.start: dtdate = self.timeOfCalcDatetime - pd.tseries.offsets.BDay(
self.dateBack)
self.startStr: str = (self.timeOfCalcDatetime - pd.tseries.offsets.BDay(
self.dateBack)).strftime('%Y-%m-%dT%H:%M:%S.%f') # e.g.: '2022-10-05T07:30:00.000'
self.endStr: str = self.timeOfCalcDatetime.strftime('%Y-%m-%dT%H:%M:%S.%f')
_optnMrktPrice: pd.DataFrame = rd.get_history(
universe=[self.instrument],
fields=[optnMrktPriceField],
interval="10min",
start=self.startStr, # Ought to always start at 4 am for OPRA exchanged Options, more info in the article below
end=self.endStr) # Ought to always end at 8 pm for OPRA exchanged Options, more info in the article below
if _optnMrktPrice.empty:
print(f"No data could be found for {self.instrument}, please check it on Refinitiv Workspace")
if debug:
print(self.instrument)
display(_optnMrktPrice)
# get a datapoint every 10 min
optnMrktPrice: pd.DataFrame = _optnMrktPrice.resample(
'10Min').mean()
# Only keep trading days
self.optnMrktPrice: pd.DataFrame = optnMrktPrice[
optnMrktPrice.index.strftime('%Y-%m-%d').isin(
[k for k in _optnMrktPrice.index.strftime('%Y-%m-%d').unique()])]
# Forward Fill to populate NaN values
self.optnMrktPrice.fillna(method='ffill', inplace=True)
# Note also that one may want to only look at 'At Option Trade' datapoints,
# i.e.: Implied Volatility when a trade is made for the Option, but not when
# none is made. For this, we will use the 'At Trade' (`AT`) dataframes:
if atOptionTradeOnly:
self.AToptnMrktPrice: pd.DataFrame = _optnMrktPrice
self.underlying: str = self.searchDf.iloc[
(self.searchDf.StrikePrice).abs().argsort()[
:1]].UnderlyingQuoteRIC.values[0][0]
_underlyingMrktPrice: pd.DataFrame = rd.get_history(
universe=[self.underlying],
fields=[self.underMrktPriceField],
interval="10min",
start=self.startStr,
end=self.endStr)
# Let's put it all in one data-frame, `df`. Some datasets will have data
# going from the time we set for `startStr` all the way to `endStr`. Some won't
# because no trade happened in the past few minutes/hours. We ought to base
# ourselves on the dataset with values getting closer to `end` and `ffill`
# for the other column. As a result, the following `if` loop is needed:
if optnMrktPrice.index[-1] >= _underlyingMrktPrice.index[-1]:
df: pd.DataFrame = self.optnMrktPrice.copy()
df[f"underlying {self.underlying} {self.underMrktPriceField}"] = _underlyingMrktPrice
else:
df = _underlyingMrktPrice.copy()
df.rename(
columns={self.underMrktPriceField:
f"underlying {self.underlying} {self.underMrktPriceField}"},
inplace=True)
df[self.underMrktPriceField] = self.optnMrktPrice
df.columns.name = self.optnMrktPrice.columns.name
df.fillna(method='ffill', inplace=True) # Forward Fill to populate NaN values
self.df = df.dropna() # Note: `self.df` is re-assigned below once the risk-free rate column is added.
if atOptionTradeOnly:
ATunderlyingMrktPrice: pd.DataFrame = self.AToptnMrktPrice.join(
_underlyingMrktPrice,
rsuffix=f"_{self.underlying}_underlying",
lsuffix=f"_{self.instrument}_OptPr",
how='inner')
self.strikePrice: pd.DataFrame = self.searchDf.iloc[
(self.searchDf['StrikePrice']-self.underlyingPrice).abs().argsort()[
:1]].StrikePrice.values[0]
# I didn't think that I needed to Type Hint for the event when
# `_riskFreeRatePrct` & `_riskFreeRatePrctField` were `None`, but Error Messages
# suggest otherwise...
_riskFreeRatePrct: Union[str, None]
_riskFreeRatePrctField: Union[str, None]
if riskFreeRatePrct is None and self.index_underlying == ".SPX":
_riskFreeRatePrct, _riskFreeRatePrctField = 'USDCFCFCTSA3M=', 'TR.FIXINGVALUE'
elif riskFreeRatePrct is None and self.index_underlying == ".STOXX50E":
_riskFreeRatePrct, _riskFreeRatePrctField = 'EURIBOR3MD=', 'TR.FIXINGVALUE'
elif riskFreeRatePrct is not None:
_riskFreeRatePrct, _riskFreeRatePrctField = riskFreeRatePrct, riskFreeRatePrctField
self.riskFreeRatePrct: Union[str, None] = riskFreeRatePrct
self.riskFreeRatePrctField: Union[str, None] = riskFreeRatePrctField
_RfRatePrct: pd.DataFrame = rd.get_history(
universe=[_riskFreeRatePrct], # USD3MFSR=, USDSOFR=
fields=[_riskFreeRatePrctField],
# Since we will use `dropna()` as a way to select the rows we are after later on in the code, we need to ask for more risk-free data than needed, just in case we don't have enough:
start=(self.start - timedelta(days=1)).strftime('%Y-%m-%d'), # https://teamtreehouse.com/community/local-variable-datetime-referenced-before-assignment
end=(self.timeOfCalcDatetime +
timedelta(days=1)).strftime('%Y-%m-%d'))
self.RfRatePrct: pd.DataFrame = _RfRatePrct.resample(
'10Min').mean().fillna(method='ffill')
df['RfRatePrct'] = self.RfRatePrct
self.df: pd.DataFrame = df.fillna(method='ffill')
if atOptionTradeOnly:
pd.options.mode.chained_assignment = None # default='warn'
ATunderlyingMrktPrice['RfRatePrct'] = [
pd.NA for i in ATunderlyingMrktPrice.index]
for i in self.RfRatePrct.index:
_i: str = str(i)[:10]
for n, m in enumerate(ATunderlyingMrktPrice.index):
if _i in str(m):
if len(self.RfRatePrct.loc[i].values) == 2:
ATunderlyingMrktPrice[
'RfRatePrct'].iloc[n] = self.RfRatePrct.loc[i].values[0][0]
elif len(self.RfRatePrct.loc[i].values) == 1:
ATunderlyingMrktPrice[
'RfRatePrct'].iloc[n] = self.RfRatePrct.loc[i].values[0]
self.ATdf: pd.DataFrame = ATunderlyingMrktPrice.copy().fillna(method='ffill') # This is in case there were no Risk Free datapoints released after a certain time, but trades on the option still went through.
if timeZoneInGraph != 'GMT':
if atOptionTradeOnly:
self.ATdf.index = [
self.ATdf.index[i].replace(
tzinfo=pytz.timezone(
'GMT')).astimezone(
tz=datetime.now().astimezone().tzinfo)
for i in range(len(self.ATdf))]
else:
df.index = [
df.index[i].replace(
tzinfo=pytz.timezone(
'GMT')).astimezone(
tz=timeZoneInGraph.tzinfo)
for i in range(len(df))]
# Define our message to the calculation endpoint in the RDP (Refinitiv Data Platform) API, `atOptionTradeOnly`:
self.universeL: List[rdf._base_definition.BaseDefinition]
if atOptionTradeOnly:
self.universeL = [
option.Definition(
underlying_type=option.UnderlyingType.ETI,
buy_sell='Buy',
instrument_code=self.instrument,
strike=float(self.strikePrice),
pricing_parameters=option.PricingParameters(
market_value_in_deal_ccy=float(
self.ATdf[
f"{optnMrktPriceField}_{self.instrument}_OptPr"][i]),
risk_free_rate_percent=float(self.ATdf['RfRatePrct'][i]),
underlying_price=float(
self.ATdf[
f"{self.underMrktPriceField}_{self.underlying}_underlying"][i]),
pricing_model_type='BlackScholes',
volatility_type='Implied',
underlying_time_stamp='Default',
report_ccy='EUR'))
for i in range(len(self.ATdf.index))]
else:
self.universeL = [
option.Definition(
underlying_type=option.UnderlyingType.ETI,
buy_sell='Buy',
instrument_code=self.instrument,
strike=float(self.strikePrice),
pricing_parameters=option.PricingParameters(
market_value_in_deal_ccy=float(df[optnMrktPriceField][i]),
risk_free_rate_percent=float(df.RfRatePrct[i]),
underlying_price=float(
df[f"underlying {self.underlying} {self.underMrktPriceField}"][i]),
pricing_model_type='BlackScholes',
volatility_type='Implied',
underlying_time_stamp='Default',
report_ccy='EUR'))
for i in range(len(df.index))]
# We would like to keep a minimum set of these fields in the IPA response in order to construct the following graphs:
for k in ["MarketValueInDealCcy", "RiskFreeRatePercent",
"UnderlyingPrice", "Volatility"][::-1]:
requestFields.insert(0, k)
self.requestFields: List[str] = requestFields
for i, p in enumerate(
[self.universeL[i:i+searchBatchMax]
for i in range(0, len(self.universeL), searchBatchMax)]): # This list chunks our `universeL` in batches of `searchBatchMax`
_IPADf: pd.DataFrame = rdf.Definitions(
universe=p, fields=requestFields).get_data().data.df
if i == 0:
self.IPADf: pd.DataFrame = _IPADf
else: # `DataFrame.append` was removed in pandas 2.0; `pd.concat` is its replacement
self.IPADf = pd.concat(
[self.IPADf, _IPADf], ignore_index=True)
if atOptionTradeOnly:
self.IPADf.index = self.ATdf.index
else:
self.IPADf.index = self.df.index
self.atOptionTradeOnly: bool = atOptionTradeOnly
return self
def Simple_graph(
self,
maxColwidth: int = 200,
size: Tuple[int, int] = (15, 5),
lineStyle: str = '.-', # 'o-'
plotting: str = 'Volatility',
displayIndexInfo: bool = False
) -> index_imp_vola_and_greeks_IPA_calc:
# We are going to want to show details about the data retrieved in a dataframe in this function's output. The line below maximises the column width of cells so we can see all that is written within them.
if displayIndexInfo:
pd.options.display.max_colwidth = maxColwidth
display(self.instrumentInfo)
IPADfSimpleGraph: pd.DataFrame = pd.DataFrame(
data=self.IPADf[[plotting]].values,
index=self.IPADf[[plotting]].index)
fig, axes = plt.subplots(ncols=1, figsize=size)
axes.plot(IPADfSimpleGraph, lineStyle)
if self.atOptionTradeOnly:
axes.set_title(f"{self.instrument} {plotting} At Trade Only")
else:
axes.set_title(f"{self.instrument} {plotting}")
self.plt = plt
return self
def Graph(
self,
include: Union[None, List[str]] = None,
graphTemplate: str = 'plotly_dark',
debug: bool=False
) -> index_imp_vola_and_greeks_IPA_calc:
if include is None:
include = self.requestFields
self.IPADfGraph = self.IPADf[include]
if debug: display(self.IPADfGraph)
self.fig = px.line(self.IPADfGraph)
# # Seems like the below (comented out) is resolved. Leaving it for future debugging if needed.
# try: # This is needed in case there is not enough data to calculate values for all timestamps , see https://stackoverflow.com/questions/67244912/wide-format-csv-with-plotly-express
# self.IPADfGraph = self.IPADf[include]
# if debug: display(self.IPADfGraph)
# self.fig = px.line(self.IPADfGraph)
# except:
# try:
# print(f"Not all fields could be graphed: {include}")
# self.IPADfGraph = self.IPADfGraph[
# ["Volatility", "MarketValueInDealCcy",
# "RiskFreeRatePercent", "UnderlyingPrice"]]
# self.fig = px.line(self.IPADfGraph)
# except:
# print(f"Not all fields could be graphed: ['Volatility', 'MarketValueInDealCcy', 'RiskFreeRatePercent', 'UnderlyingPrice']")
# self.IPADfGraph = self.IPADfGraph[
# ["Volatility", "MarketValueInDealCcy",
# "RiskFreeRatePercent", "UnderlyingPrice"]]
# self.fig = px.line(self.IPADfGraph)
self.graphTemplate = graphTemplate
return self
def Overlay(
self
) -> index_imp_vola_and_greeks_IPA_calc:
self.fig.update_layout(
title=self.instrument,
template=self.graphTemplate)
self.fig.for_each_trace(
lambda t: t.update(
visible=True if t.name in self.IPADfGraph.columns[:1] else "legendonly"))
return self
def Stack3(
self,
autosize: bool = False,
width: int = 1300,
height: int = 500
) -> index_imp_vola_and_greeks_IPA_calc:
self.fig = plotly.subplots.make_subplots(rows=3, cols=1)
self.fig.add_trace(go.Scatter(
x=self.IPADf.index, y=self.IPADfGraph.Volatility,
name='Op Imp Volatility'), row=1, col=1)
self.fig.add_trace(go.Scatter(
x=self.IPADf.index, y=self.IPADfGraph.MarketValueInDealCcy,
name='Op Mk Pr'), row=2, col=1)
self.fig.add_trace(go.Scatter(
x=self.IPADf.index, y=self.IPADfGraph.UnderlyingPrice,
name=self.underlying+' Undrlyg Pr'), row=3, col=1)
self.fig.update(layout_xaxis_rangeslider_visible=False)
self.fig.update_layout(title=self.IPADfGraph.columns.name)
self.fig.update_layout(
title=self.instrument,
template=self.graphTemplate,
autosize=autosize,
width=width,
height=height)
return self
index_imp_vola_and_greeks_IPA_calc Overlay Test
from functools import wraps
from time import time
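The `wraps` import above suggests a reusable timing decorator rather than repeated `s = time()` boilerplate; here is a minimal sketch (the `timed` name and print format are my own, not from the original code):

```python
from functools import wraps
from time import time

def timed(func):
    """Decorator that prints how long the wrapped call took."""
    @wraps(func)  # preserve the wrapped function's name and docstring
    def wrapper(*args, **kwargs):
        start = time()
        result = func(*args, **kwargs)
        print(f"{func.__name__} took {time() - start:.3f}s")
        return result
    return wrapper

@timed
def slow_sum(n):
    return sum(range(n))

total = slow_sum(1_000_000)
```

Decorating a method such as `search_index_opt_ATM` the same way would print its runtime on every call without the manual `s = time()` / `print(time() - s)` pairs used below.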
index_imp_vola_and_greeks_IPA_calc().search_index_opt_ATM().ATMOpt
'STXE41000P3.EX'
# index_imp_vola_and_greeks_IPA_calc().search_index_opt_ATM().IPA_calc().Graph(debug=True).Overlay().fig.show()
s = time()
t1 = index_imp_vola_and_greeks_IPA_calc()
print(time() - s)
0.0
s = time()
t2 = t1.search_index_opt_ATM()
print(time() - s)
4.971007823944092
t1.ATMOpt
'STXE41000P3.EX'
index_imp_vola_and_greeks_IPA_calc().search_index_opt_ATM().ATMOpt
'STXE41250P3.EX'
index_imp_vola_and_greeks_IPA_calc().search_index_opt_ATM().ATMOpt
'STXE41250P3.EX'
s = time()
t3 = t2.IPA_calc()
print(time() - s)
21.338878870010376
s = time()
t3 = t2.Graph(debug=True)
print(time() - s)
| | MarketValueInDealCcy | RiskFreeRatePercent | UnderlyingPrice | Volatility | DeltaPercent | GammaPercent | RhoPercent | ThetaPercent | VegaPercent |
|---|---|---|---|---|---|---|---|---|---|
| 2023-03-14 09:10:00+01:00 | 110.0 | 2.753 | 4100.3 | 21.236716 | -0.488709 | 0.001453 | -2.084896 | -1.539653 | 5.118502 |
| 2023-03-14 09:40:00+01:00 | 106.5 | 2.753 | 4114.63 | 21.891511 | -0.468043 | 0.001401 | -2.004483 | -1.586342 | 5.122988 |
| 2023-03-14 09:50:00+01:00 | 106.3 | 2.753 | 4117.22 | 22.088216 | -0.464473 | 0.001387 | -1.990986 | -1.599943 | 5.122486 |
| 2023-03-14 10:00:00+01:00 | 109.4 | 2.753 | 4106.97 | 21.749656 | -0.478848 | 0.001416 | -2.047576 | -1.576564 | 5.122176 |
| 2023-03-14 10:40:00+01:00 | 110.5 | 2.753 | 4105.54 | 21.830447 | -0.480837 | 0.001411 | -2.056039 | -1.582192 | 5.121585 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 2023-03-16 12:30:00+01:00 | 146.0 | 2.646 | 4068.51 | 25.086035 | -0.525523 | 0.001238 | -2.252807 | -1.802243 | 5.068409 |
| 2023-03-16 12:40:00+01:00 | 139.0 | 2.646 | 4081.85 | 25.066443 | -0.509055 | 0.001237 | -2.186519 | -1.809341 | 5.095103 |
| 2023-03-16 13:00:00+01:00 | 145.0 | 2.646 | 4064.63 | 24.483388 | -0.53187 | 0.001268 | -2.275254 | -1.756661 | 5.05738 |
| 2023-03-16 13:20:00+01:00 | 146.3 | 2.646 | 4075.46 | 25.857575 | -0.515421 | 0.0012 | -2.216098 | -1.861411 | 5.084274 |
| 2023-03-16 14:30:00+01:00 | 156.0 | 2.646 | 4052.57 | 25.377735 | -0.54435 | 0.001223 | -2.329659 | -1.808759 | 5.026452 |
76 rows × 9 columns
0.1254889965057373
s = time()
t4 = t3.Overlay().fig.show()
print(time() - s)
0.0411677360534668
index_imp_vola_and_greeks_IPA_calc Simple Graph Test
index_imp_vola_and_greeks_IPA_calc().search_index_opt_ATM().IPA_calc().Graph(debug=True).Stack3().fig.show()
| | MarketValueInDealCcy | RiskFreeRatePercent | UnderlyingPrice | Volatility | DeltaPercent | GammaPercent | RhoPercent | ThetaPercent | VegaPercent |
|---|---|---|---|---|---|---|---|---|---|
| 2023-03-13 16:50:00+01:00 | 104.3 | 2.957 | 4115.42 | 24.080919 | -0.432544 | 0.00126 | -1.858586 | -1.69381 | 5.069878 |
| 2023-03-14 10:50:00+01:00 | 104.2 | 2.753 | 4101.77 | 22.801912 | -0.449826 | 0.001344 | -1.922581 | -1.619456 | 5.085256 |
| 2023-03-14 14:30:00+01:00 | 71.0 | 2.753 | 4177.94 | 22.234661 | -0.348144 | 0.001265 | -1.504628 | -1.502993 | 4.841338 |
| 2023-03-14 16:10:00+01:00 | 66.1 | 2.753 | 4183.59 | 21.6196 | -0.337516 | 0.001284 | -1.457882 | -1.446727 | 4.791758 |
| 2023-03-14 16:20:00+01:00 | 65.6 | 2.753 | 4186.8 | 21.740407 | -0.334132 | 0.001271 | -1.444482 | -1.450137 | 4.776695 |
| 2023-03-15 10:10:00+01:00 | 83.6 | 2.815 | 4138.78 | 21.851357 | -0.397499 | 0.001355 | -1.705079 | -1.524007 | 5.002307 |
| 2023-03-15 10:30:00+01:00 | 95.3 | 2.815 | 4114.14 | 22.145603 | -0.431844 | 0.00137 | -1.846323 | -1.564368 | 5.066786 |
| 2023-03-15 10:50:00+01:00 | 106.7 | 2.815 | 4099.31 | 23.097975 | -0.453025 | 0.001329 | -1.936888 | -1.637633 | 5.087092 |
| 2023-03-15 11:10:00+01:00 | 117.1 | 2.815 | 4083.15 | 23.668493 | -0.47447 | 0.001308 | -2.02629 | -1.679064 | 5.091099 |
| 2023-03-15 11:20:00+01:00 | 124.9 | 2.815 | 4054.58 | 22.425543 | -0.514369 | 0.001392 | -2.180171 | -1.582211 | 5.06069 |
| 2023-03-15 11:30:00+01:00 | 135.8 | 2.815 | 4058.52 | 24.973756 | -0.504807 | 0.001249 | -2.154644 | -1.763101 | 5.069006 |
| 2023-03-15 11:40:00+01:00 | 133.0 | 2.815 | 4031.94 | 21.644943 | -0.548569 | 0.00144 | -2.312677 | -1.508364 | 4.996483 |
| 2023-03-15 11:50:00+01:00 | 149.8 | 2.815 | 4032.16 | 25.025447 | -0.537643 | 0.001249 | -2.285916 | -1.746851 | 5.012305 |
| 2023-03-15 12:00:00+01:00 | 134.9 | 2.815 | 4048.0 | 23.730194 | -0.520573 | 0.001316 | -2.211464 | -1.669365 | 5.048714 |
| 2023-03-15 12:10:00+01:00 | 138.2 | 2.815 | 4052.73 | 24.866305 | -0.512238 | 0.001256 | -2.18383 | -1.752234 | 5.059391 |
| 2023-03-15 12:20:00+01:00 | 138.8 | 2.815 | 4046.17 | 24.313957 | -0.521676 | 0.001285 | -2.218772 | -1.708988 | 5.045634 |
| 2023-03-15 12:40:00+01:00 | 135.6 | 2.815 | 4047.99 | 23.86781 | -0.520288 | 0.001309 | -2.21101 | -1.679015 | 5.048902 |
| 2023-03-15 13:00:00+01:00 | 139.5 | 2.815 | 4042.16 | 24.035525 | -0.527506 | 0.0013 | -2.240644 | -1.686291 | 5.035775 |
| 2023-03-15 13:20:00+01:00 | 132.7 | 2.815 | 4047.12 | 23.203357 | -0.522922 | 0.001346 | -2.218221 | -1.631741 | 5.04587 |
| 2023-03-15 13:30:00+01:00 | 138.8 | 2.815 | 4036.95 | 23.345265 | -0.536205 | 0.001338 | -2.271881 | -1.6332 | 5.020024 |
| 2023-03-15 13:40:00+01:00 | 146.0 | 2.815 | 4033.34 | 24.393987 | -0.537937 | 0.001281 | -2.28396 | -1.703567 | 5.013404 |
| 2023-03-15 14:00:00+01:00 | 145.5 | 2.815 | 4037.11 | 24.696302 | -0.532311 | 0.001266 | -2.263066 | -1.728365 | 5.024668 |
| 2023-03-15 14:10:00+01:00 | 144.2 | 2.815 | 4037.97 | 24.528659 | -0.531655 | 0.001274 | -2.259622 | -1.717352 | 5.026439 |
| 2023-03-15 14:30:00+01:00 | 141.2 | 2.815 | 4050.48 | 25.231128 | -0.514373 | 0.001238 | -2.194182 | -1.776359 | 5.055571 |
| 2023-03-15 14:40:00+01:00 | 129.7 | 2.815 | 4058.07 | 23.725309 | -0.507331 | 0.001315 | -2.158506 | -1.675358 | 5.067832 |
| 2023-03-15 15:00:00+01:00 | 142.9 | 2.815 | 4061.81 | 26.699094 | -0.49847 | 0.001168 | -2.137899 | -1.885611 | 5.073753 |
| 2023-03-15 15:10:00+01:00 | 131.1 | 2.815 | 4059.65 | 24.159177 | -0.504595 | 0.001291 | -2.149723 | -1.706595 | 5.070459 |
| 2023-03-15 15:20:00+01:00 | 130.3 | 2.815 | 4058.34 | 23.870707 | -0.506743 | 0.001307 | -2.15688 | -1.685695 | 5.06833 |
| 2023-03-15 15:30:00+01:00 | 135.3 | 2.815 | 4048.92 | 23.904098 | -0.518994 | 0.001307 | -2.206025 | -1.682223 | 5.050936 |
| 2023-03-15 15:40:00+01:00 | 141.4 | 2.815 | 4040.46 | 24.234535 | -0.529214 | 0.00129 | -2.248438 | -1.698849 | 5.032029 |
| 2023-03-15 15:50:00+01:00 | 144.4 | 2.815 | 4056.21 | 26.440594 | -0.505403 | 0.001181 | -2.164359 | -1.86462 | 5.065995 |
| 2023-03-15 16:10:00+01:00 | 138.3 | 2.815 | 4070.16 | 26.604919 | -0.488813 | 0.001169 | -2.098699 | -1.882361 | 5.082676 |
| 2023-03-15 16:20:00+01:00 | 134.3 | 2.815 | 4062.26 | 25.048471 | -0.500038 | 0.001245 | -2.13592 | -1.770179 | 5.074277 |
| 2023-03-15 16:40:00+01:00 | 139.7 | 2.815 | 4067.62 | 26.635367 | -0.491754 | 0.001169 | -2.110653 | -1.883597 | 5.08028 |
| 2023-03-15 17:10:00+01:00 | 149.0 | 2.815 | 4041.09 | 25.810029 | -0.524676 | 0.001212 | -2.238176 | -1.810007 | 5.036936 |
| 2023-03-16 09:20:00+01:00 | 120.2 | 2.646 | 4084.07 | 24.297546 | -0.473989 | 0.001274 | -2.027841 | -1.732824 | 5.091869 |
| 2023-03-16 09:30:00+01:00 | 121.4 | 2.646 | 4087.56 | 24.85653 | -0.469508 | 0.001243 | -2.012591 | -1.772236 | 5.092349 |
| 2023-03-16 10:20:00+01:00 | 149.1 | 2.646 | 4056.55 | 27.331879 | -0.504446 | 0.001142 | -2.165337 | -1.937077 | 5.066616 |
| 2023-03-16 13:10:00+01:00 | 132.8 | 2.646 | 4070.91 | 25.527232 | -0.489669 | 0.001218 | -2.097074 | -1.816736 | 5.083867 |
| 2023-03-16 13:30:00+01:00 | 133.0 | 2.646 | 4068.9 | 25.372439 | -0.492274 | 0.001227 | -2.106754 | -1.805244 | 5.081987 |
index_imp_vola_and_greeks_IPA_calc().search_index_opt_ATM().IPA_calc().Simple_graph().plt.show()
test1 = index_imp_vola_and_greeks_IPA_calc(index_underlying=".SPX")
test2 = test1.search_index_opt_ATM(
debug=False,
after=15,
call_or_put='Put',
searchFields=["ExchangeCode", "UnderlyingQuoteName"],
include_weekly_opts=False,
topNuSearchResults=10,
timeOfCalcDatetime=datetime.now(),
underMrktPriceField="TRDPRC_1")
test3 = test2.IPA_calc(
dateBack=3,
optnMrktPriceField="TRDPRC_1",
debug=False,
atOptionTradeOnly=True,
riskFreeRate=None,
riskFreeRateField=None,
timeZoneInGraph=datetime.now().astimezone())
test3.IPADf
test4 = test3.Simple_graph(
maxColwidth=200,
size=(15, 5),
lineStyle='.-', # 'o-'
plotting='Volatility'
)
test4.plt.show()
# rd.close_session() # It's good practice to close our RD session when done.
# # !pip3 install --trusted-host pypi.org dash_tvlwc
# # !pip3 install --trusted-host pypi.org dash
# import random
# import dash_tvlwc
# import dash
# from dash.dependencies import Input, Output, State
# from dash import html
# from datetime import datetime
# import random
# import pandas as pd
# def generate_random_series(*args, **kwargs):
# return generate_random_ohlc(*args, **kwargs, close_only=True)
# def generate_random_ohlc(v0: float, ret=0.05, n=500, t0='2021-01-01', close_only=False):
# datelist = [dt.strftime('%Y-%m-%d') for dt in pd.date_range(t0, periods=n).tolist()]
# res = []
# c = v0
# for dt in datelist:
# o = c
# c = o * (1 + random.uniform(-ret, ret))
# if not close_only:
# h = max(o, c) * (1 + random.uniform(0, ret))
# l = min(o, c) * (1 + random.uniform(-ret, 0))
# res.append({
# "time": dt,
# "open": o,
# "high": h,
# "low": l,
# "close": c
# })
# else:
# res.append({
# "time": dt,
# "value": c
# })
# o = c
# return res
# app = dash.Dash(__name__, external_stylesheets=['./assets/stylesheet.css'])
# chart_options = {
# 'layout': {
# 'background': {'type': 'solid', 'color': '#1B2631'},
# 'textColor': 'white',
# },
# 'grid': {
# 'vertLines': {'visible': False},
# 'horzLines': {'visible': False},
# },
# 'localization': {'locale': 'en-US'}
# }
# panel1 = [
# html.H2('Bar'),
# dash_tvlwc.Tvlwc(
# id='bar-chart',
# seriesData=[generate_random_ohlc(v0=100, n=50)],
# seriesTypes=['bar'],
# width='100%',
# chartOptions=chart_options
# )
# ]
# p2_series = generate_random_ohlc(v0=1, n=50, ret=0.1)
# p2_series = [{'time': v['time']} if 12 < idx < 20 or idx > 45 else v for idx, v in enumerate(p2_series)]
# panel2 = [
# html.H2('Candlestick'),
# dash_tvlwc.Tvlwc(
# id='candlestick-chart',
# seriesData=[p2_series],
# seriesTypes=['candlestick'],
# seriesOptions=[{
# 'downColor': '#a6269a',
# 'upColor': '#ffaa30',
# 'borderColor': 'black',
# 'wickColor': 'black'
# }],
# width='100%',
# chartOptions={'layout': {'background': {'type': 'solid', 'color': 'white'}}}
# )
# ]
# panel3 = [
# html.H2('Area'),
# dash_tvlwc.Tvlwc(
# id='area-chart',
# seriesData=[generate_random_series(v0=15, n=50)],
# seriesTypes=['area'],
# seriesOptions=[{
# 'lineColor': '#FFAA30',
# 'topColor': '#2962FF',
# 'bottomColor': 'rgba(180, 98, 200, 0.1)',
# 'priceLineWidth': 3,
# 'priceLineColor': 'red'
# }],
# width='100%',
# chartOptions=chart_options
# )
# ]
# p4_series = generate_random_series(v0=5000, n=50)
# p4_mean = sum([p['value'] for p in p4_series]) / 50
# p4_max = max([p['value'] for p in p4_series])
# price_lines = [{'price': p4_max, 'color': '#2962FF', 'lineStyle': 0, 'title': 'MAX PRICE', 'axisLabelVisible': True}]
# panel4 = [
# html.H2('Baseline'),
# dash_tvlwc.Tvlwc(
# id='baseline-chart',
# seriesData=[p4_series],
# seriesTypes=['baseline'],
# seriesOptions=[{
# 'baseValue': {'type': 'price', 'price': p4_mean},
# 'topFillColor1': 'black',
# 'topFillColor2': 'rgba(255,255,255,0)',
# 'topLineColor': 'black',
# 'crosshairMarkerRadius': 8,
# 'lineWidth': 5,
# 'priceScaleId': 'left'
# }],
# seriesPriceLines=[price_lines],
# width='100%',
# chartOptions={
# 'rightPriceScale': {'visible': False},
# 'leftPriceScale': {'visible': True, 'borderColor': 'rgba(197, 203, 206, 1)',},
# 'timeScale': {'visible': False},
# 'grid': {'vertLines': {'visible': False}, 'horzLines': {'style': 0, 'color': 'black'}},
# }
# )
# ]
# # add markers and add color to volume bar
# p5_series = generate_random_series(v0=1, n=50, ret=0.1)
# markers = [
# {'time': p5_series[15]['time'], 'position': 'aboveBar', 'color': '#f68410', 'shape': 'circle', 'text': 'Signal'},
# {'time': p5_series[20]['time'], 'position': 'belowBar', 'color': 'white', 'shape': 'arrowUp', 'text': 'Buy'}
# ]
# p5_series_volume = generate_random_series(v0=100, n=50, ret=0.05)
# for i in p5_series_volume:
# i['color'] = random.choice(['rgba(0, 150, 136, 0.8)', 'rgba(255,82,82, 0.8)'])
# panel5 = [
# html.H2('Line and volume'),
# dash_tvlwc.Tvlwc(
# id='line-chart',
# seriesData=[p5_series, p5_series_volume],
# seriesTypes=['line', 'histogram'],
# seriesOptions=[
# {
# 'lineWidth': 1
# },
# {
# 'color': '#26a69a',
# 'priceFormat': {'type': 'volume'},
# 'priceScaleId': '',
# 'scaleMargins': {'top': 0.9, 'bottom': 0},
# 'priceLineVisible': False
# },
# ],
# seriesMarkers=[markers],
# width='100%',
# chartOptions=chart_options
# )
# ]
# p6_series = generate_random_series(v0=100, n=50, ret=0.3)
# for idx, _ in enumerate(p6_series):
# if idx in [5,12,13,14,20,33,34,46]:
# p6_series[idx]['color'] = 'white'
# panel6 = [
# html.H2('Histogram'),
# dash_tvlwc.Tvlwc(
# id='histogram-chart',
# seriesData=[p6_series],
# seriesTypes=['histogram'],
# seriesOptions=[{
# 'color': '#ff80cc',
# 'base': 100,
# 'priceLineVisible': False,
# 'lastValueVisible': False
# }],
# width='100%',
# chartOptions={'layout': {'textColor': '#ff80cc', 'background': {'type': 'solid', 'color': 'black'}}}
# )
# ]
# app.layout = html.Div([
# html.H1('Chart options and series options'),
# html.Div(className='container', children=[
# html.Div(className='one', children=panel1),
# html.Div(className='two', children=panel2),
# html.Div(className='three', children=panel3),
# html.Div(className='four', children=panel4),
# html.Div(className='five', children=panel5),
# html.Div(className='six', children=panel6),
# ])
# ])
# if __name__ == '__main__':
# app.run_server(debug=True)
# !where python
# import os, signal
# os.kill(os.getpid(), signal.SIGTERM)
As you can see, not only can we use IPA to gather large amounts of bespoke, calculated values, but we can also portray this insight in a simple, quick and relevant way. The last cell in particular loops through our built function to give an updated graph every 5 seconds using 'legacy' technologies that would work in most environments (e.g.: Eikon Codebook).
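The 5-second refresh loop mentioned above can be sketched as follows; `draw_graph` is a hypothetical stand-in for the chained calls built earlier (e.g. `...IPA_calc().Graph().Overlay().fig.show()`), and the interval and iteration count are illustrative:

```python
import time

def refresh_loop(draw_graph, interval_s=5, iterations=3):
    """Repeatedly call a graph-producing function, pausing between runs."""
    results = []
    for _ in range(iterations):
        results.append(draw_graph())  # redraw the figure with fresh data
        time.sleep(interval_s)        # wait before the next refresh
    return results

# Illustrative usage with a dummy callable standing in for the real graphing chain:
ticks = refresh_loop(lambda: "drawn", interval_s=0.01, iterations=2)
```

In a notebook you would typically clear the previous output (e.g. with `IPython.display.clear_output`) before each redraw so only the latest figure is shown.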
Brilliant: Black-Scholes-Merton
What is the RIC syntax for options in Refinitiv Eikon?
Functions to find Option RICs traded on different exchanges
Making your code faster: Cython and parallel processing in the Jupyter Notebook
What Happens to Options When a Stock Splits?
Select column that has the fewest NA values
Return Column(s) if they Have a certain Percentage of NaN Values (Python)
How to Split a String Between Numbers and Letters?
RIC nomenclature for expired Options on Futures
Expiration Dates for Expired Options API
Measure runtime of a Jupyter Notebook code cell
# import refinitiv.data as rd
# rd.get_config().set_param(
# param=f"logs.transports.console.enabled", value=True
# )
# session = rd.open_session("desktop.workspace")
# session.set_log_level("DEBUG")
# SPX_test2OptnMrktPrice10min = rd.get_history(
# universe=list(SPX_test2['valid_ric'][0].keys())[0],
# fields=["TRDPRC_1"],
# interval="10min",
# start='2021-10-25T11:53:09.168706',
# end='2021-10-26T19:53:10.166926') # Ought to always end at 8 pm for OPRA exchanged Options, more info in the article below
# SPX_test2OptnMrktPrice
# SPX_test2OptnMrktPrice1d = rd.content.historical_pricing.summaries.Definition(
# universe=list(SPX_test2['valid_ric'][0].keys())[0],
# start='2021-10-23',
# end='2021-10-29',
# fields=['BID', 'ASK', 'TRDPRC_1'],
# interval=rd.content.historical_pricing.Intervals.DAILY).get_data()
# SPX_test2OptnMrktPrice1d.data.df
# SPX_test2OptnMrktPriceTest = rd.content.historical_pricing.summaries.Definition(
# interval=rd.content.historical_pricing.Intervals.FIVE_MINUTES,
# # interval='PT1H',
# universe=list(SPX_test2['valid_ric'][0].keys())[0],
# start='2021-10-25T10:53:09.168706',
# end='2021-10-27T19:53:10.166926',
# fields=['BID', 'ASK', 'TRDPRC_1']).get_data()
# SPX_test2OptnMrktPriceTest.errors
# SPX_test2OptnMrktPriceTest.get_data().data.df
# SPX_test2OptnMrktPriceTest
ek = rd.eikon
# The app key is stored in a text file so that it can be used in this code without being displayed:
eikon_key = open("eikon.txt", "r")
ek.set_app_key(str(eikon_key.read()))
# It is best to close the files we opened in order to make sure that we don't stop any other services/programs from accessing them if they need to:
eikon_key.close()
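The open/read/close pattern above can be made safer with a context manager, which closes the file automatically even if an exception occurs; a sketch, assuming the same `eikon.txt` layout (here a stand-in file is created just so the snippet is self-contained):

```python
from pathlib import Path

# For illustration only: create a stand-in key file (in the article it already exists).
Path("eikon.txt").write_text("my-app-key\n")

# The `with` block closes the file automatically, even if an exception is raised:
with open("eikon.txt", "r") as key_file:
    app_key = key_file.read().strip()  # strip the trailing newline
# ek.set_app_key(app_key)
```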
testDf, err = ek.get_data(
instruments=list(SPX_test2['valid_ric'][0].keys())[0],
fields=['TRDPRC_1'],
parameters={
'SDate': '2021-10-25T10:53:09.168706',
'EDate': '2021-10-28T19:53:10.166926'})
testDf
We're lucky in that index prices are adjusted automatically for corporate actions, so we don't have to take them into account in our calculations. However, we would have to if we focused on Equities. If that is the case you're interested in, don't hesitate to use the code below, which should work with the functions above:
def Get_trans_days(year, mcal_get_calendar='EUREX', trans_day='first'):
'''
Get_trans_days Version 2.0:
This function gets transaction days for each month of a specified year.
Changes
----------------------------------------------
Changed from Version 1.0 to 2.0: Jonathan Legrand changed Haykaz Aramyan's original code to allow
(i) function name changed from `get_trans_days` to `Get_trans_days`
(ii) for the function's holiday argument to be changed, allowing any calendar supported by `mcal.get_calendar`, defaulted to 'EUREX' as opposed to 'CBOE_Index_Options'.
Dependencies
----------------------------------------------
from datetime import timedelta. (This is part of Python's standard library; tested with Python '3.8.12'.)
Python library 'pandas_market_calendars', imported as mcal, version '4.1.0' (originally written against version '3.2').
Parameters
-----------------------------------------------
Arguments:
year (int):
Year for which transaction days are requested
mcal_get_calendar(str):
String of the calendar for which holidays have to be taken into account. More on this calendar (link to Github checked 2022-10-11): https://github.com/rsheftel/pandas_market_calendars/blob/177e7922c7df5ad249b0d066b5c9e730a3ee8596/pandas_market_calendars/exchange_calendar_cboe.py
Default: mcal_get_calendar='EUREX'
trans_day (str):
Takes either 'first' or 'third' indicating to the first business day or the 3rd Friday of a month respectively
Default: trans_day='first'
Output:
trans_days (list):
List of transaction days, one for each of the 12 months
'''
# get the first business day of each month
if trans_day == 'first':
mkt = mcal.get_calendar(mcal_get_calendar)
holidays = mkt.holidays().holidays
# set start and end day ranges
start_date, end_date = f"{str(year)}-01-01", f"{str(year)}-12-31"
trans_days = []
for date in pd.date_range(start_date, end_date, freq='BMS'):
# get the first day after the weekend after checking for holiday
while date.isoweekday() > 5 or date in holidays:
date += timedelta(1)
# add found day to the list
trans_days.append(date.date().day)
# get the 3rd Friday for each month by calling function "get_exp_dates"
elif trans_day == 'third':
trans_days = get_exp_dates(year)[year]
else:
print('Please input "first" or "third" for transaction day')
return
return trans_days
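The weekend/holiday roll inside `Get_trans_days` can be illustrated in isolation with pandas alone; the holiday list below is a made-up example, not the EUREX calendar:

```python
from datetime import timedelta
import pandas as pd

def first_business_day(year, month, holidays=()):
    """Roll the 1st of the month forward past weekends and listed holidays."""
    date = pd.Timestamp(year=year, month=month, day=1)
    holidays = {pd.Timestamp(h) for h in holidays}
    # Saturday/Sunday have isoweekday() 6/7; keep stepping forward one day at a time.
    while date.isoweekday() > 5 or date in holidays:
        date += timedelta(days=1)
    return date.date()

# 2023-01-01 is a Sunday; with 2023-01-02 declared a holiday, we roll to the 3rd:
d = first_business_day(2023, 1, holidays=["2023-01-02"])
```

`Get_trans_days` applies exactly this roll to each `freq='BMS'` (business month start) date, using the holiday set from `mcal.get_calendar(...)`.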
def Adjustment_factor(corp_event, year=None, date=None, trans_day='first', mcal_get_calendar='EUREX'):
'''
Adjustment_factor Version 2.0:
This function gets the stock-split adjustment factor(s) for a given asset. If no split event happened during the requested period,
the function returns 1 (if the date argument is used) or a list of twelve 1s (if the year argument is used), i.e. no price adjustment.
Changes
----------------------------------------------
Changed from Version 1.0 to 2.0: Jonathan Legrand changed Haykaz Aramyan's original code:
(i) function name changed from `adjustment_factor` to `Adjustment_factor`.
Dependencies
----------------------------------------------
Python library 'pandas_market_calendars' version 3.2
Parameters
-----------------------------------------------
Input:
asset (str):
RIC code of the asset
year (int):
Year for which stock split events are requested
date (str with date (YYYY-MM-DD) format):
Date as of which stock split events are requested
trans_day (str, default = 'first'):
Indicates the date of the transaction for get_trans_days function
mcal_get_calendar(str):
String of the calendar for which holidays have to be taken into account. More on this calendar (link to Github checked 2022-10-11): https://github.com/rsheftel/pandas_market_calendars/blob/177e7922c7df5ad249b0d066b5c9e730a3ee8596/pandas_market_calendars/exchange_calendar_cboe.py
Default: mcal_get_calendar='EUREX'
Output:
adj_factor (float): This is returned in case of date argument is used. The output is the Adjustment factor after split
adj_factors(list): This is returned in case of year argument is used. The output is the list of Adjustment factors after split for each month
'''
# if there is no stock split corporate event
if (corp_event is None) or (corp_event['Capital Change Effective Date'][0] is None):
if year is not None and date is None:
# return list of 1s if year argument is used
adj_factors = 12 * [1]
return adj_factors
elif date is not None and year is None:
# return 1 if exact date argument is used
adj_factor = 1
return adj_factor
else:
print('Either Year or exact date needs to be passed to the function')
# if there is an event adjustment factor(s) is(are) calculated
else:
if year is not None and date is None: # in case of year argument is used
# request transaction dates
trans_days = Get_trans_days(year=year, mcal_get_calendar='EUREX', trans_day=trans_day)
adj_factors = []
for i in range(1, 13):
# get exp_dates and use it as a request date for stock split corporate events
exp_date = str(year) + '-' + str(i) + '-' + str(trans_days[i - 1])
# initiate adj_factor with 1
adj_factor = 1
# we first check if the expiration date of option is after or before the adjustment date
for j in reversed(range(len(corp_event))):
# if expiration date is smaller than adjustment date then we need adjustment
if pd.to_datetime(exp_date).strftime('%Y-%m-%d') < pd.to_datetime(corp_event['Capital Change Effective Date'][j]).strftime('%Y-%m-%d'):
adj_factor = float(corp_event['Adjustment Factor'][j]) * adj_factor # we should consider all adjustment factors which are after the expiration day
# append adjustment factor of the month to the list
adj_factors.append(adj_factor)
return adj_factors
elif date is not None and year is None: # in case the exact date argument is used
adj_factor = 1
for j in reversed(range(len(corp_event))):
# if expiration date is smaller than adjustment date then we need adjustment
if pd.to_datetime(date).strftime('%Y-%m-%d') < corp_event['Capital Change Effective Date'][j]:
adj_factor = float(corp_event['Adjustment Factor'][j]) * adj_factor
return adj_factor
else:
print('Either Year or exact date needs to be passed to the function')
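The compounding logic in `Adjustment_factor` (multiply together every factor whose effective date falls after the valuation date) can be checked on a toy event table; the column names mirror those used above, but the dates and factors are invented for illustration:

```python
import pandas as pd

def compound_adjustment(corp_event, date):
    """Multiply adjustment factors for all splits effective after `date`."""
    factor = 1.0
    for j in reversed(range(len(corp_event))):
        # Only splits that become effective after the valuation date require adjustment:
        if pd.to_datetime(date) < pd.to_datetime(corp_event['Capital Change Effective Date'][j]):
            factor *= float(corp_event['Adjustment Factor'][j])
    return factor

# Toy table: a 2-for-1 split in 2020 and a 4-for-1 split in 2022 (most recent first,
# matching the descending order returned by the corporate-events request above).
events = pd.DataFrame({
    'Capital Change Effective Date': ['2022-08-25', '2020-08-31'],
    'Adjustment Factor': [0.25, 0.5]})
f = compound_adjustment(events, '2019-06-01')  # both splits lie ahead: 0.5 * 0.25
```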
def Get_potential_rics(year, trans_day, asset, OTM_size, diff, opt_type, reportLog=True, debug=False):
'''
Get_potential_rics Version 2.0:
This function returns the list of potential option RICs for a specified year reconstructed based on Refinitiv RIC and option trading rules.
Changes
----------------------------------------------
Changed from Version 1.0 to 2.0: Jonathan Legrand changed Haykaz Aramyan's original code:
(i) function name changed from `get_potential_rics` to `Get_potential_rics`
(ii) changed function body to reflect the changed function name from `get_trans_days` to `Get_trans_days`
(iii) added argument `reportLog`, a boolean that, if left to its default `True`, reports a log of the function output
Dependencies
----------------------------------------------
Python library 'Refinitiv Dataplatform' version 1.0.0a8.post1
Parameters
-----------------------------------------------
Input:
year (int): year for which transaction days are requested
trans_day (str, default = 'first'): takes either 'first' or 'third' indicating to the first business day or the 3rd Friday of a month respectively
asset (str): RIC code of the asset
OTM_size (int): percentage number indicating how far away is the strike price from the price of the underlying asset
diff (int): Tolerated difference in OTM to construct upper and lower bounds of strike prices
opt_type (str): takes either "call" or "put"
reportLog (bool): report log of the function output if True. Default: reportLog=True.
Output:
Tuple of two objects:
potential_RICs (dict): dictionary containing potential RICs for each month with strike prices from the lower to upper bounds of strikes
strikes (list): list of the strike prices calculated based on OTM size for each month
'''
# open file to report log of the function output
if reportLog:
report = open("Log report.txt", "a")
# call functions to get expiration and transaction days
trans_days = Get_trans_days(
year=year, mcal_get_calendar='EUREX', trans_day=trans_day)
trans_days_prev = Get_trans_days(
year=year-1, mcal_get_calendar='EUREX', trans_day=trans_day)
dates = get_exp_dates(year)
# trim underlying asset's RIC to get the required part for option RIC
if asset[0] == '.': # check if the asset is an index or an equity
asset_name = asset[1:] # get the asset name - we remove "." symbol for index options
adj_factors = 12 * [1] # set adjustment factors to be equal to 1 for each month (no stock split corporate event is applicable to indices)
else:
asset_name = asset.split('.')[0] # we need only the first part of the RICs for equities
# get list of corporate events for equities
if debug: print(f"asset: {asset}")
corp_event = rd.get_data(
universe=asset,
fields=["TR.CAEffectiveDate", "TR.CAAdjustmentFactor", "TR.CAAdjustmentType"],
parameters={
"CAEventType": "SSP",
"SDate": datetime.today().strftime("%Y-%m-%d"),
"EDate": "-50Y"})
if debug: print(f"corp_event: {corp_event}")
# run adjustment_factor function to get the factors
adj_factors = Adjustment_factor(corp_event, year=year, trans_day=trans_day)
# define expiration month codes to be used after "^" sign
exp = ['A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L']
potential_RICs = {}
strikes = []
# construct potential RICs for each month of a specified year
for j in range(1, 13):
# get day of expiration for a month
day = dates[year][j - 1]
# get date of price request, which is in the previous month of expiration
if j != 1:
date = str(year) + '-' + str(j - 1) + '-' + str(trans_days[j - 2])
if j == 1: # for January, we need to subtract a year along with the month
date = str(year - 1) + '-' + str(j + 11) + '-' + str(trans_days_prev[j + 10])
# get price of underlying asset as of the transaction date
# get the corresponding adjustment factor for the month
adj_factor = adj_factors[j-1]
price = rd.get_data(
asset,
fields=['TR.PriceClose'],
parameters={'SDate': date})
price = float(price.iloc[0, 1]) / adj_factor # adjust prices by the adjustment factor; if there are no stock split events, adj_factor = 1
# calculate the strike price for call options
if opt_type.lower() == 'call':
strike = price + price * OTM_size / 100
# define expiration month codes for call options while also considering the strike price
if strike > 999.999:
exp_codes_call = [
'a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j', 'k', 'l']
else:
exp_codes_call = [
'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L']
# get expiration month code for a month
exp_month = exp_codes_call[j-1]
# calculate the strike price and get expiration month code for a month for put options
elif opt_type.lower() == 'put':
strike = price - price * OTM_size/100
if strike > 999.999:
exp_codes_put = [
'm', 'n', 'o', 'p', 'q', 'r', 's', 't', 'u', 'v', 'w', 'x']
else:
exp_codes_put = [
'M', 'N', 'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X']
exp_month = exp_codes_put[j-1]
strikes.append(int(round(strike, 0))) # append the calculated strike price to the list of strikes
# calculate lower and upper bounds for strikes considering the value of the strike
if strike > 999.999:
step = 5 # we loop over strikes with a step 5 for larger strikes
strike_ub = int(round((strike + strike * diff / 100),-1))
strike_lb = int(round((strike - strike * diff / 100),-1))
else:
step = 1 # we loop over strikes with a step 1 for smaller strikes
strike_ub = int(strike + strike * diff / 100)
strike_lb = int(strike - strike * diff / 100)
# construct RICs for each strike from the lower to upper bound ranges of strikes
for n in range(strike_lb, strike_ub + step, step):
k = None # for strikes < 1000 along with 1 step increment change in strikes we do 0.5 point increment change which
# allows us to consider strikes with decimal points. This is important to get closer OTMs for smaller valued assets.
# here we construct option RICs by adding together all the RIC components
# Please note some of the components are different depending on the strike value
plc_holdr1 = asset_name + exp_month + str(day) + str(year)[-2:]
plc_holdr2 = exp[j - 1] + str(year)[-2:]
if n < 10:
z = plc_holdr1 + '00' + str(n) + '00.U^' + plc_holdr2# for integer steps
k = plc_holdr1 + '00' + str(n) + '50.U^' + plc_holdr2# for decimal steps
elif n >= 10 and n < 100:
z = plc_holdr1 + '0' + str(n) + '00.U^' + plc_holdr2
k = plc_holdr1 + '0' + str(n) + '50.U^' + plc_holdr2
elif n >= 100 and n < 1000:
z = plc_holdr1 + str(n) + '00.U^' + plc_holdr2
k = plc_holdr1 + str(n) + '50.U^' + plc_holdr2
elif n >= 1000 and n < 10000:
z = plc_holdr1 + str(n) + '0.U^' + plc_holdr2
elif n >= 10000 and n < 20000:
z = plc_holdr1 + 'A' + str(n)[-4:] + '.U^' + plc_holdr2
elif n >= 20000 and n < 30000:
z = plc_holdr1 + 'B' + str(n)[-4:] + '.U^' + plc_holdr2
elif n >= 30000 and n < 40000:
z = plc_holdr1 + 'C' + str(n)[-4:] + '.U^' + plc_holdr2
elif n >= 40000 and n < 50000:
z = plc_holdr1 + 'D' + str(n)[-4:] + '.U^' + plc_holdr2
# append RICs with integer strikes to the dictionary
if j in potential_RICs:
potential_RICs[j].append(z)
# append RICs with decimal point strikes to the dictionary
if k is not None:
potential_RICs[j].append(k)
else:
potential_RICs[j] = [z]
if k is not None:
potential_RICs[j].append(k)
# report function results and close the log file
if reportLog:
now = datetime.now().strftime('%Y-%m-%d %H:%M:%S')
report.write(f'{now}: Potential RICs for {opt_type} options with {OTM_size}% OTM for {year} are constructed\n')
report.close()
return potential_RICs, strikes
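The month-code convention embedded in `Get_potential_rics` — letters A–L for call expiries, M–X for puts, switched to lower case when the strike exceeds 999.999 — can be isolated in a small helper for clarity (the `expiry_month_code` name is my own):

```python
import string

def expiry_month_code(month, opt_type, strike):
    """Return the RIC expiry-month letter: A-L for calls, M-X for puts,
    lower case when the strike exceeds 999.999."""
    offset = 0 if opt_type.lower() == 'call' else 12  # puts start at 'M'
    code = string.ascii_uppercase[offset + month - 1]
    return code.lower() if strike > 999.999 else code

c = expiry_month_code(3, 'call', 500)   # March call, small strike
p = expiry_month_code(3, 'put', 1500)   # March put, large strike
```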
call_RICs, call_strikes = Get_potential_rics(
year=2017,
trans_day='first',
asset='.STOXX50E',
OTM_size=5,
diff=3,
opt_type='call',
reportLog=True,
debug=False)
print(call_strikes)
rd.get_data(
'.STOXX50E',
fields=['TR.CAAdjustmentFactor(SDate=2017-01-01,EDate=2018-12-31)'])
I then gathered data on two Options, a live one ('STXE42000D3.EX') and an expired one ('HSI19300N3.HF^B23').
import refinitiv.data as rd
import refinitiv.data.content.ipa.financial_contracts as rdf
from refinitiv.data.content.ipa.financial_contracts import option
import pandas as pd
# Let's authenticate ourselves to LSEG's Data and Analytics service, Refinitiv:
try: # The following configuration file is not available in Codebook, hence this try/except block
rd.open_session(config_name="C:\\Example.DataLibrary.Python-main\\Example.DataLibrary.Python-main\\Configuration\\refinitiv-data.config.json")
rd.open_session("desktop.workspace")
except:
rd.open_session()
[2023-03-28 18:31:23,920] - [DEBUG] - [sessions.platform.rdp.4] - [2976] | MainThread
+ Session created: PlatformSession
name = rdp
server_mode = False
stream_auto_reconnection = True
signon_control = True
authentication_token_endpoint_url = https://api.refinitiv.com/auth/oauth2/v1/token
[2023-03-28 18:31:23,976] - [DEBUG] - [sessions.platform.rdp.4] - [2976] | MainThread
Created session connection SessionCxnType.REFINITIV_DATA
[2023-03-28 18:31:24,002] - [DEBUG] - [sessions.platform.rdp.4] - [33140] | AuthManager-Thread
Request to https://api.refinitiv.com/auth/oauth2/v1/token
method = POST
[2023-03-28 18:31:24,833] - [DEBUG] - [sessions.platform.rdp.4] - [33140] | AuthManager-Thread
HTTP request response 200: { "access_token": <elided>, "refresh_token": <elided>, "expires_in": "600", "scope": <elided>, "token_type": "Bearer" }
[2023-03-28 18:31:24,836] - [DEBUG] - [sessions.platform.rdp.4] - [33140] | AuthManager-Thread
AuthManager: Access token handler, event: access_token_success, message: All is well
[2023-03-28 18:31:24,838] - [DEBUG] - [sessions.platform.rdp.4] - [33140] | AuthManager-Thread
AuthManager: Refresh token will be requested in 299 seconds
[2023-03-28 18:31:24,841] - [DEBUG] - [sessions.platform.rdp.4] - [2976] | MainThread
Opened session
[2023-03-28 18:31:24,847] - [DEBUG] - [sessions.desktop.workspace.5] - [2976] | MainThread
DesktopSession created with following parameters: app_key="9afd663988954069a4c4140d944e01c3a568e9a9", name="workspace" base_url="http://localhost:9000" platform_path_rdp="/api/rdp" platform_path_udf="/api/udf" handshake_url="/api/handshake"
[2023-03-28 18:31:26,969] - [DEBUG] - [sessions.desktop.workspace.5] - [2976] | MainThread
HTTP request response 200: {"statusCode":"ST_PROXY_READY","version":"3.4.2"}
[2023-03-28 18:31:26,983] - [DEBUG] - [sessions.desktop.workspace.5] - [2976] | MainThread
HTTP request response 200: {"access_token": <elided>, "expires_in":1209600, "token_type":"bearer"}
[2023-03-28 18:31:26,986] - [DEBUG] - [sessions.desktop.workspace.5] - [2976] | MainThread
Opened session
def Chunks(lst, n):
"""Yield successive n-sized chunks from lst."""
for i in range(0, len(lst), n):
yield lst[i:i + n]
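For instance, `Chunks` yields consecutive slices of at most `n` elements, the last one possibly shorter. A self-contained toy illustration (repeating the definition, with made-up values):

```python
def Chunks(lst, n):
    """Yield successive n-sized chunks from lst."""
    for i in range(0, len(lst), n):
        yield lst[i:i + n]

# A hypothetical 7-element list split into batches of 3
batches = list(Chunks(list(range(7)), 3))
print(batches)  # [[0, 1, 2], [3, 4, 5], [6]]
```

We will use this below to keep each IPA request within the service's limit of instruments per call.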
HSI_test0 = rd.content.historical_pricing.summaries.Definition(
'STXE42000D3.EX',
interval=rd.content.historical_pricing.Intervals.DAILY,
fields=['SETTLE'],
start='2022-11-07',
end='2023-02-01').get_data().data.df
hk_rf = 100 - rd.get_history(
universe=['HK3MT=RR'], # HK10YGB=EODF, HKGOV3MZ=R, HK3MT=RR
fields=['TR.MIDPRICE'],
start=HSI_test0.index[0].strftime('%Y-%m-%d'),
end=HSI_test0.index[-1].strftime('%Y-%m-%d'))
HSI_test1 = pd.merge(
HSI_test0, hk_rf, left_index=True, right_index=True)
HSI_test1 = HSI_test1.rename(
columns={"SETTLE": "OptionPrice", "Mid Price": "RfRatePrct"})
hist_HSI_undrlying_pr = rd.get_history(
universe=['.HSI'],
fields=["TRDPRC_1"],
# interval="1D",
start=HSI_test0.index[0].strftime('%Y-%m-%d'),
end=HSI_test0.index[-1].strftime('%Y-%m-%d'))
HSI_test2 = pd.merge(HSI_test1, hist_HSI_undrlying_pr,
left_index=True, right_index=True)
HSI_test2 = HSI_test2.rename(
columns={"TRDPRC_1": "UndrlyingPr"})
HSI_test2.columns.name = 'STXE42000D3.EX'
HSI_test2
| STXE42000D3.EX | OptionPrice | RfRatePrct | UndrlyingPr |
|---|---|---|---|
| Date | | | |
| 2022-11-09 | 37.8 | 0.7365 | 16358.52 |
| 2022-11-10 | 59.5 | 0.731 | 16081.04 |
| 2022-11-11 | 65.2 | 0.713 | 17325.66 |
| 2022-11-14 | 66.1 | 0.71 | 17619.71 |
| 2022-11-15 | 71.5 | 0.7965 | 18343.12 |
| ... | ... | ... | ... |
| 2023-01-26 | 112.8 | 0.577 | 22566.78 |
| 2023-01-27 | 109.9 | 0.532 | 22688.9 |
| 2023-01-30 | 104.4 | 0.523 | 22069.73 |
| 2023-01-31 | 104.7 | 0.5745 | 21842.33 |
| 2023-02-01 | 106.4 | 0.5925 | 22072.18 |
61 rows × 3 columns
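Note that `pd.merge` with `left_index=True, right_index=True` performs an inner join by default, so only dates present in both frames survive; that is why the merged frame above keeps 61 rows. A minimal sketch with made-up dates and values:

```python
import pandas as pd

# Two frames with partially overlapping DatetimeIndexes (hypothetical values)
a = pd.DataFrame({'OptionPrice': [37.8, 59.5]},
                 index=pd.to_datetime(['2022-11-09', '2022-11-10']))
b = pd.DataFrame({'RfRatePrct': [0.7365]},
                 index=pd.to_datetime(['2022-11-09']))

# Default how='inner': only the common date 2022-11-09 is kept
merged = pd.merge(a, b, left_index=True, right_index=True)
print(len(merged))  # 1
```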
requestFields = ['MarketValueInDealCcy', 'RiskFreeRatePercent', 'UnderlyingPrice', 'PricingModelType', 'DividendType', 'UnderlyingTimeStamp', 'ReportCcy', 'VolatilityType', 'Volatility', 'DeltaPercent', 'GammaPercent', 'RhoPercent', 'ThetaPercent', 'VegaPercent']
live_universe = [
{
"instrumentType": "Option",
"instrumentDefinition": {
"buySell": "Buy",
"underlyingType": "Eti",
"instrumentCode": 'STXE42000D3.EX',
"strike": float(4200),
},
"pricingParameters": {
"marketValueInDealCcy": float(HSI_test2['OptionPrice'][i]),
"riskFreeRatePercent": float(HSI_test2['RfRatePrct'][i]),
"underlyingPrice": float(HSI_test2['UndrlyingPr'][i]),
"pricingModelType": "BlackScholes",
"dividendType": "ImpliedYield",
"volatilityType": "Implied",
"underlyingTimeStamp": "Default",
"reportCcy": "HKD"
}
}
for i in range(len(HSI_test2.index))]
batchOf = 100
for i, j in enumerate(Chunks(live_universe, batchOf)):
print(f"Batch of {batchOf} requests no. {i+1}/{len(list(Chunks(live_universe, batchOf)))} started")
# Request the IPA financial-contracts endpoint directly, passing fields, outputs and universe in the body
live_troubleshoot_request_definition = rd.delivery.endpoint_request.Definition(
method=rd.delivery.endpoint_request.RequestMethod.POST,
url='https://api.refinitiv.com/data/quantitative-analytics/v1/financial-contracts',
body_parameters={"fields": requestFields,
"outputs": ["Data", "Headers"],
"universe": j})
live_troubleshoot_resp = live_troubleshoot_request_definition.get_data()
headers_name = [h['name'] for h in live_troubleshoot_resp.data.raw['headers']]
if i == 0:
live_troubleshoot_df = pd.DataFrame(
data=live_troubleshoot_resp.data.raw['data'],
columns=headers_name)
else:
_live_troubleshoot_df = pd.DataFrame(
data=live_troubleshoot_resp.data.raw['data'],
columns=headers_name)
live_troubleshoot_df = pd.concat([live_troubleshoot_df, _live_troubleshoot_df], ignore_index=True)  # DataFrame.append was removed in pandas 2.0
print(f"Batch of {batchOf} requests no. {str(i+1)}/{str(len([i for i in Chunks(live_universe, batchOf)]))} ended")
Batch of 100 requests no. 1/1 started
Batch of 100 requests no. 1/1 ended
live_troubleshoot_df
| | MarketValueInDealCcy | RiskFreeRatePercent | UnderlyingPrice | PricingModelType | DividendType | UnderlyingTimeStamp | ReportCcy | VolatilityType | Volatility | DeltaPercent | GammaPercent | RhoPercent | ThetaPercent | VegaPercent |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 37.8 | 0.7365 | 16358.52 | BlackScholes | ImpliedYield | Default | HKD | Calculated | None | NaN | NaN | NaN | NaN | NaN |
| 1 | 59.5 | 0.7310 | 16081.04 | BlackScholes | ImpliedYield | Default | HKD | Calculated | None | NaN | NaN | NaN | NaN | NaN |
| 2 | 65.2 | 0.7130 | 17325.66 | BlackScholes | ImpliedYield | Default | HKD | Calculated | None | NaN | NaN | NaN | NaN | NaN |
| 3 | 66.1 | 0.7100 | 17619.71 | BlackScholes | ImpliedYield | Default | HKD | Calculated | None | NaN | NaN | NaN | NaN | NaN |
| 4 | 71.5 | 0.7965 | 18343.12 | BlackScholes | ImpliedYield | Default | HKD | Calculated | None | NaN | NaN | NaN | NaN | NaN |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 56 | 112.8 | 0.5770 | 22566.78 | BlackScholes | ImpliedYield | Default | HKD | Calculated | None | NaN | NaN | NaN | NaN | NaN |
| 57 | 109.9 | 0.5320 | 22688.90 | BlackScholes | ImpliedYield | Default | HKD | Calculated | None | NaN | NaN | NaN | NaN | NaN |
| 58 | 104.4 | 0.5230 | 22069.73 | BlackScholes | ImpliedYield | Default | HKD | Calculated | None | NaN | NaN | NaN | NaN | NaN |
| 59 | 104.7 | 0.5745 | 21842.33 | BlackScholes | ImpliedYield | Default | HKD | Calculated | None | NaN | NaN | NaN | NaN | NaN |
| 60 | 106.4 | 0.5925 | 22072.18 | BlackScholes | ImpliedYield | Default | HKD | Calculated | None | NaN | NaN | NaN | NaN | NaN |
61 rows × 14 columns
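The batching loops in this section rely on a `Chunks` helper defined earlier in the article. In case you are reading this section on its own, a minimal sketch of such a generator (the name and signature are assumed to match the earlier definition):

```python
def Chunks(lst, n):
    """Yield successive n-sized chunks from `lst` (the last chunk may be shorter)."""
    for k in range(0, len(lst), n):
        yield lst[k:k + n]
```

With `batchOf = 100`, a universe of 61 instruments therefore produces a single batch, which is why the printout above reads `no. 1/1`.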
hist_universe = [
{
"instrumentType": "Option",
"instrumentDefinition": {
"buySell": "Buy",
"underlyingType": "Eti",
"instrumentCode": 'HSI19300N3.HF^B23',
"strike": float(4200),
},
"pricingParameters": {
"marketValueInDealCcy": float(HSI_test2['OptionPrice'][i]),
"riskFreeRatePercent": float(HSI_test2['RfRatePrct'][i]),
"underlyingPrice": float(HSI_test2['UndrlyingPr'][i]),
"pricingModelType": "BlackScholes",
"dividendType": "ImpliedYield",
"volatilityType": "Implied",
"underlyingTimeStamp": "Default",
"reportCcy": "HKD"
}
}
for i in range(len(HSI_test2.index))]
for i, j in enumerate(Chunks(hist_universe, batchOf)):
    print(f"Batch of {batchOf} requests no. {i+1}/{len(list(Chunks(hist_universe, batchOf)))} started")
    # POST request to the IPA financial-contracts endpoint via the RD delivery layer
hist_troubleshoot_request_definition = rd.delivery.endpoint_request.Definition(
method=rd.delivery.endpoint_request.RequestMethod.POST,
url='https://api.refinitiv.com/data/quantitative-analytics/v1/financial-contracts',
body_parameters={"fields": requestFields,
"outputs": ["Data", "Headers"],
"universe": j})
hist_troubleshoot_resp = hist_troubleshoot_request_definition.get_data()
headers_name = [h['name'] for h in hist_troubleshoot_resp.data.raw['headers']]
if i == 0:
hist_troubleshoot_df = pd.DataFrame(
data=hist_troubleshoot_resp.data.raw['data'],
columns=headers_name)
else:
_hist_troubleshoot_df = pd.DataFrame(
data=hist_troubleshoot_resp.data.raw['data'],
columns=headers_name)
        hist_troubleshoot_df = pd.concat(  # `DataFrame.append` was removed in pandas 2.0
            [hist_troubleshoot_df, _hist_troubleshoot_df], ignore_index=True)
    print(f"Batch of {batchOf} requests no. {i+1}/{len(list(Chunks(hist_universe, batchOf)))} ended")
Batch of 100 requests no. 1/1 started
Batch of 100 requests no. 1/1 ended
hist_troubleshoot_df
| | MarketValueInDealCcy | RiskFreeRatePercent | UnderlyingPrice | PricingModelType | DividendType | UnderlyingTimeStamp | ReportCcy | VolatilityType | Volatility | DeltaPercent | GammaPercent | RhoPercent | ThetaPercent | VegaPercent |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | None | None | None | None | None | None | None | None | None | None | None | None | None | None |
| 1 | None | None | None | None | None | None | None | None | None | None | None | None | None | None |
| 2 | None | None | None | None | None | None | None | None | None | None | None | None | None | None |
| 3 | None | None | None | None | None | None | None | None | None | None | None | None | None | None |
| 4 | None | None | None | None | None | None | None | None | None | None | None | None | None | None |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 56 | None | None | None | None | None | None | None | None | None | None | None | None | None | None |
| 57 | None | None | None | None | None | None | None | None | None | None | None | None | None | None |
| 58 | None | None | None | None | None | None | None | None | None | None | None | None | None | None |
| 59 | None | None | None | None | None | None | None | None | None | None | None | None | None | None |
| 60 | None | None | None | None | None | None | None | None | None | None | None | None | None | None |
61 rows × 14 columns
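When every value comes back as `None`, as in the table above for the expired RIC, the per-instrument error message in the raw response usually explains why. A sketch of pulling those messages out, assuming `'ErrorCode'` and `'ErrorMessage'` were added to `requestFields` (the payload below is a made-up illustration shaped like `hist_troubleshoot_resp.data.raw`; the error code shown is not a real one):

```python
def extract_errors(raw):
    """Return (row_index, error_message) pairs for the instruments
    in a raw financial-contracts response that actually errored."""
    headers = [h['name'] for h in raw['headers']]
    code_ix = headers.index('ErrorCode')
    msg_ix = headers.index('ErrorMessage')
    return [(n, row[msg_ix])
            for n, row in enumerate(raw['data'])
            if row[code_ix]]  # keep only rows with a non-empty error code

# Illustrative payload only -- values and error code are invented:
raw = {'headers': [{'name': 'Volatility'},
                   {'name': 'ErrorCode'},
                   {'name': 'ErrorMessage'}],
       'data': [[12.3, None, None],
                [None, 'SomeErrorCode', 'Market data error']]}
```

Running `extract_errors(raw)` on the real response would list, per failing instrument, the service's own explanation (for an expired RIC, typically a missing-market-data message).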
live_hist_daily_universe_l = [
option.Definition(
underlying_type=option.UnderlyingType.ETI,
buy_sell='Buy',
        instrument_code='STXE42000D3.EX',  # alternatives: 'HSI19300N3.HF^B23' or list(HSI_test2['valid_ric'][0].keys())[0]
strike=float(4200),
pricing_parameters=option.PricingParameters(
market_value_in_deal_ccy=float(HSI_test2['OptionPrice'][i]),
risk_free_rate_percent=float(HSI_test2['RfRatePrct'][i]),
underlying_price=float(HSI_test2['UndrlyingPr'][i]),
pricing_model_type='BlackScholes',
volatility_type='Implied',
underlying_time_stamp='Default',
report_ccy='HKD'
))
for i in range(len(HSI_test2.index))]
for i, j in enumerate(Chunks(live_hist_daily_universe_l, 100)):
    print(f"Batch of {len(j)} requests no. {i+1}/{len(list(Chunks(live_hist_daily_universe_l, 100)))} started")
    # Same request, this time via the RD content layer's `option.Definition` objects
    troubleshoot_resp_live = rdf.Definitions(universe=j, fields=requestFields)
    troubleshoot_resp_live_getdata = troubleshoot_resp_live.get_data()
    if i == 0:
        troubleshoot_resp_live_df = troubleshoot_resp_live_getdata.data.df
    else:
        troubleshoot_resp_live_df = pd.concat(  # `DataFrame.append` was removed in pandas 2.0
            [troubleshoot_resp_live_df, troubleshoot_resp_live_getdata.data.df],
            ignore_index=True)
    print(f"Batch of {len(j)} requests no. {i+1}/{len(list(Chunks(live_hist_daily_universe_l, 100)))} ended")
troubleshoot_resp_live_df
exp_hist_daily_universe_l = [
option.Definition(
underlying_type=option.UnderlyingType.ETI,
buy_sell='Buy',
instrument_code='HSI19300N3.HF^B23', # 'STXE42000D3.EX' # 'HSI19300N3.HF^B23'
strike=float(hist_opt_found_strk_pr),
pricing_parameters=option.PricingParameters(
market_value_in_deal_ccy=float(HSI_test2['OptionPrice'][i]),
risk_free_rate_percent=float(HSI_test2['RfRatePrct'][i]),
underlying_price=float(HSI_test2['UndrlyingPr'][i]),
pricing_model_type='BlackScholes',
volatility_type='Implied',
underlying_time_stamp='Default',
report_ccy='HKD'
))
for i in range(len(HSI_test2.index))]
for i, j in enumerate(Chunks(exp_hist_daily_universe_l, 100)):
    print(f"Batch of {len(j)} requests no. {i+1}/{len(list(Chunks(exp_hist_daily_universe_l, 100)))} started")
    # Same request, this time via the RD content layer's `option.Definition` objects
    troubleshoot_resp_exp = rdf.Definitions(universe=j, fields=requestFields)
    troubleshoot_resp_exp_getdata = troubleshoot_resp_exp.get_data()
    if i == 0:
        troubleshoot_resp_exp_df = troubleshoot_resp_exp_getdata.data.df
    else:
        troubleshoot_resp_exp_df = pd.concat(  # `DataFrame.append` was removed in pandas 2.0
            [troubleshoot_resp_exp_df, troubleshoot_resp_exp_getdata.data.df],
            ignore_index=True)
    print(f"Batch of {len(j)} requests no. {i+1}/{len(list(Chunks(exp_hist_daily_universe_l, 100)))} ended")
troubleshoot_resp_exp_df
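Since several of the result frames in this section come back partly or entirely empty, a quick sanity check is to list which requested fields actually arrived populated in each frame. A minimal sketch with a stand-in frame (in the notebook you would pass `live_troubleshoot_df`, `hist_troubleshoot_df`, or `troubleshoot_resp_exp_df` instead):

```python
import pandas as pd

def populated_columns(df):
    """Names of the columns holding at least one non-missing value."""
    return [c for c in df.columns if df[c].notna().any()]

# Stand-in frame for illustration -- not real service output:
demo = pd.DataFrame({'Volatility': [None, None],
                     'DeltaPercent': [0.51, 0.49]})
```

Here `populated_columns(demo)` keeps only `'DeltaPercent'`; applied to the frames above, it separates the fields the service computed from those it returned as `None`/`NaN`.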